Description
We're running into a limitation with the number of LLM models that can be displayed in the UI.
Actual Behavior: Currently, we can get at most two model/provider options to appear in the UI at any time. The limit appears to be:
Ollama (all locally installed or cloud models)
PLUS one other LLM block (e.g., OpenAI, Anthropic, etc.)
We have tried enabling more LLM blocks in the .env file, but this does not seem to have any effect; the UI still shows only two options.
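For reference, this is the kind of configuration we tried. The variable names below are illustrative only, since we are not certain of the exact keys the project expects; the assumption is that each provider block is enabled by setting its API key or endpoint:

```shell
# Hypothetical .env entries; actual key names may differ per the project's docs.
OLLAMA_BASE_URL=http://localhost:11434
OPENAI_API_KEY=sk-xxxx
ANTHROPIC_API_KEY=sk-ant-xxxx
MISTRAL_API_KEY=xxxx
GROQ_API_KEY=gsk-xxxx
```

Even with several provider keys set this way, only Ollama plus one other block shows up in the model selector.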
Expected Behavior / Feature Request: We would like to request an increase to this limit. It would be extremely helpful for our workflow to be able to see and select from at least five different LLM providers/models directly in the UI.
Is this the intended behavior? If it's intended, please consider this a feature request to increase the number of active LLM blocks.
-Parthasarathi Mukhopadhyay