Describe the bug
I added a local Ollama LLM provider in the settings, following the provider-setup process in the documentation. The problem is that when I try to chat with any custom model, it only returns "No response generated", and I don't know where to look to check the URL configuration.
Also, I used a Python script (sketched below) to call http://localhost:11434/v1/completions directly and it works, but when chatting through aperag no request ever shows up in the terminal where Ollama is running.
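For reference, the direct test was essentially the following. This is a minimal sketch assuming the `requests` library; the model name is a placeholder for whichever model is actually pulled in Ollama:

```python
# Minimal sketch: direct request to Ollama's OpenAI-compatible
# completions endpoint, bypassing aperag entirely.
import requests

resp = requests.post(
    "http://localhost:11434/v1/completions",
    json={
        "model": "llama3",       # placeholder: use a model pulled in Ollama
        "prompt": "Say hello.",
        "max_tokens": 32,
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["text"])
```

This request shows up in the terminal where Ollama is running, while requests sent through aperag never do.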
Additional context
- OS: Windows 11 (Docker)
- Browser: Brave