How to use Natural Language Categorization & Search with Ollama as the backend? #906
I added the following variables to use Ollama as the backend, but nothing happened. Note that Ollama doesn't require an API key for local models.
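As a sanity check, it can help to confirm that Ollama's OpenAI-compatible endpoint answers at all before pointing the app at it. A minimal sketch, assuming Ollama's default port 11434 and a locally pulled model (the `llama3` name is just a placeholder):

```python
import requests

# Ollama's default OpenAI-compatible endpoint (assumption: default install on port 11434).
base_url = "http://localhost:11434/v1"

payload = {
    "model": "llama3",  # assumption: replace with a model you have actually pulled
    "messages": [{"role": "user", "content": "ping"}],
}

# A 200 response here means the endpoint itself works; if this fails,
# the problem is with Ollama rather than the app's configuration.
r = requests.post(f"{base_url}/chat/completions", json=payload, timeout=60)
print(r.status_code, r.json())
```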
First, check what errors are recorded in the `log.log` file in the root directory.
Maybe you can try entering any random value into the API key field first, to allow some of the validation logic to run properly.
I've added a lot of logging. Please try it again now.
The `OPENAI_BASE_URL` you provided is incorrect. If it were correct, you should be able to call the endpoint `{base_url}/chat/completions` (the API needs to be OpenAI-compatible). For example, with OpenRouter, it would be `https://openrouter.ai/api/v1`.
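To make the requirement concrete, here is a minimal sketch using the official `openai` Python client: the same code talks to OpenRouter or a local Ollama server purely by swapping `base_url`, which is what OpenAI-compatible means here. The `llama3` model name and the dummy `ollama` key are assumptions (Ollama ignores the key; OpenRouter needs a real one):

```python
from openai import OpenAI

# The client calls {base_url}/chat/completions under the hood, so the base URL
# must point at an OpenAI-compatible API root.
client = OpenAI(
    base_url="http://localhost:11434/v1",  # for OpenRouter: "https://openrouter.ai/api/v1"
    api_key="ollama",                       # placeholder; Ollama ignores it, OpenRouter requires a real key
)

resp = client.chat.completions.create(
    model="llama3",  # assumption: use a model you have pulled / have access to
    messages=[{"role": "user", "content": "Say hello"}],
)
print(resp.choices[0].message.content)
```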