Local LLMs
#459
Is it possible to use PicoClaw with local LLMs via Ollama, without an external API?
P.S. I'm sorry if this is a dumb question; I'm just new to this topic.

Replies: 2 comments

Yes, it is possible: you can use local models with Ollama, such as GLM.
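For reference, here is a minimal sketch of how any tool can talk to a local Ollama server directly, with no cloud API involved. This is a generic illustration of Ollama's local `/api/chat` endpoint, not PicoClaw's actual configuration; the model name `glm4` is just an example (use whatever model you have pulled):

```python
# Minimal sketch. Assumptions: Ollama is running on its default port
# (11434) and a model has been pulled locally, e.g. `ollama pull glm4`.
import json
import urllib.request

def ask_local_llm(prompt: str, model: str = "glm4") -> str:
    """Send one chat message to a local Ollama server and return the reply."""
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for a single complete JSON response
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/chat",  # Ollama's local chat endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["message"]["content"]

if __name__ == "__main__":
    print(ask_local_llm("Say hello from a local model."))
```

Also worth knowing: Ollama exposes an OpenAI-compatible endpoint at `http://localhost:11434/v1`, so tools that speak the OpenAI API can often be pointed at a local model just by changing the base URL.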
@ollxel feel free to reach out if you run into issues or need support. Every question is a chance to learn, so don’t hesitate! 🙂