@Sherrbethead Sherrbethead commented Jan 26, 2026

Summary

Adds API key support for the Ollama provider, which is required for cloud models: https://docs.ollama.com/api/authentication

Changes

Exclude Ollama from the keyless providers. It won't hurt local setups.

Type of change

  • Bug fix
  • Feature
  • Refactor
  • Documentation
  • Chore/CI

Affected areas

  • Core (Go)
  • Transports (HTTP)
  • Providers/Integrations
  • Plugins
  • UI (Next.js)
  • Docs

How to test

# Core/Transports
go version
go test ./...

# UI
cd ui
pnpm i || npm i
pnpm test || npm test
pnpm build || npm run build

Breaking changes

  • Yes
  • No

Related issues

#1075 introduces the custom provider implementation for Ollama that includes API key support.
Although the PR seems abandoned, this interim solution quickly addresses the issue by using OpenAI-compatible methods.

Checklist

  • I read docs/contributing/README.md and followed the guidelines
  • I added/updated tests where appropriate
  • I updated documentation where needed
  • I verified builds succeed (Go and UI)
  • I verified the CI pipeline passes locally if applicable

@akshaydeo (Contributor)

❤️ for the PR @Sherrbethead - can you please update the docs


coderabbitai bot commented Jan 26, 2026

📝 Walkthrough

Updated version-control ignores, changed provider key requirement logic (Ollama now requires an API key; SGL remains keyless), and expanded Ollama documentation to cover local/cloud inference, BaseURL for cloud endpoints, API key guidance, and self-hosting notes.

Changes

  • Version Control Configuration (.gitignore): Added an ignore rule for the .idea directory to exclude IDE/project metadata.
  • Provider Authentication Logic (core/utils.go): Modified providerRequiresKey so that only SGL is keyless; Ollama now requires an API key. Updated the explanatory comment.
  • Documentation (docs/providers/supported-providers/ollama.mdx): Expanded the Ollama docs to cover local and cloud inference, added cloud BaseURL guidance and API key authentication notes, and clarified self-hosted vs. cloud access.

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

Poem

🐰 A hop, a tweak, a tiny spark,
.idea tucked away in the dark,
Ollama asks for a key so bright,
Cloud and local now in sight.
I nibble edits, fluffy and light.

🚥 Pre-merge checks | ✅ 3 passed

  • Title check: ✅ Passed. The title 'hotfix: ollama api key for cloud models' clearly and concisely summarizes the main change: adding API key support to Ollama for cloud models.
  • Docstring Coverage: ✅ Passed. Docstring coverage is 100.00%, which is sufficient (required threshold: 80.00%).
  • Description check: ✅ Passed. The PR description follows the template structure and includes most required sections with clear explanations of changes, testing steps, and checklist items.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🤖 Fix all issues with AI agents
In `@core/utils.go`:
- Around line 83-89: providerRequiresKey now treats Ollama like a keyed provider
which breaks local Ollama instances; update providerRequiresKey (or add a small
helper it calls) to detect local Ollama by checking providerKey ==
schemas.Ollama and the customConfig.BaseURL (or another available base URL
field) for localhost/127.0.0.1 (and http://localhost:11434) and return false for
those local cases so getAllSupportedKeys won’t require a key; ensure you
reference providerRequiresKey and getAllSupportedKeys in your change and add a
unit test covering the local Ollama flow.

Comment on lines +83 to +89
 // Some providers like SGL are keyless and don't require API keys.
 func providerRequiresKey(providerKey schemas.ModelProvider, customConfig *schemas.CustomProviderConfig) bool {
 	// Keyless custom providers are not allowed for Bedrock.
 	if customConfig != nil && customConfig.IsKeyLess && customConfig.BaseProviderType != schemas.Bedrock {
 		return false
 	}
-	return providerKey != schemas.Ollama && providerKey != schemas.SGL
+	return providerKey != schemas.SGL

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

# Find all call sites of providerRequiresKey to see how the return value is used
rg -n 'providerRequiresKey' -g'*.go' -A 5 -B 2

Repository: maximhq/bifrost



🏁 Script executed:

# Search for Ollama-specific handling, especially around localhost or local instances
rg -n 'Ollama|ollama' -g'*.go' -A 3 -B 1 | head -100



🏁 Script executed:

# Look for custom provider configuration and keyless handling
rg -n 'CustomProviderConfig|IsKeyLess' -g'*.go' -A 2 -B 1



🏁 Script executed:

# Check if there's any localhost or local endpoint detection for Ollama
rg -n 'localhost|127\.0\.0\.1|baseURL' -g'*.go' | grep -i ollama



Verify local Ollama still works without requiring API keys.

With providerRequiresKey now returning true for Ollama (no longer treated as keyless like SGL), standard Ollama instances—especially local setups using http://localhost:11434—will fail when getAllSupportedKeys() is called unless keys are provided. A custom provider configured with is_key_less: true can bypass this, but that requires users to reconfigure their setup. Either:

  1. Add a targeted keyless path for local Ollama (e.g., BaseURL pattern matching for localhost or 127.0.0.1), or
  2. Update tests and documentation to clarify that local Ollama must now use a custom keyless provider, or
  3. Provide a test case covering the local Ollama flow to ensure it continues to work.
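A minimal sketch of the localhost detection suggested in option 1. The helper name isLocalBaseURL and its placement are assumptions, not code from this PR; the real check would live alongside providerRequiresKey in core/utils.go:

```go
package main

import (
	"fmt"
	"net/url"
)

// isLocalBaseURL reports whether a configured base URL points at a local
// Ollama instance. providerRequiresKey could call a helper like this and
// return false for local Ollama, so getAllSupportedKeys does not demand a key.
func isLocalBaseURL(baseURL string) bool {
	u, err := url.Parse(baseURL)
	if err != nil {
		return false
	}
	host := u.Hostname() // drops a port such as :11434 and any [] around IPv6
	return host == "localhost" || host == "127.0.0.1" || host == "::1"
}

func main() {
	fmt.Println(isLocalBaseURL("http://localhost:11434")) // local setup: no key needed
	fmt.Println(isLocalBaseURL("https://ollama.com/api")) // cloud endpoint: key needed
}
```

Parsing the URL rather than substring-matching avoids false positives such as a remote host named "mylocalhost.example.com".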

@Sherrbethead (Author)

❤️ for the PR @Sherrbethead - can you please update the docs

@akshaydeo updated the Ollama docs.

@Sherrbethead Sherrbethead force-pushed the hotfix-ollama-api-key branch from f476d0f to e04bdb3 on January 26, 2026 at 16:05
@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

🤖 Fix all issues with AI agents
In `@docs/providers/supported-providers/ollama.mdx`:
- Around line 37-38: Update the example BaseURL values to point to the API
paths: change the cloud example from `https://ollama.com` to
`https://ollama.com/api` and change the local example from
`http://localhost:11434` to `http://localhost:11434/api`; keep the note that
cloud requests require an API key (Bearer token) and local REST API access does
not require authentication, and ensure the text references the `BaseURL` setting
so readers know which config to update.
- Around line 9-13: Update the Ollama docs so the cloud endpoint references the
full API base path: replace the phrase "cloud models via ollama.com" with "via
the Ollama Cloud service (https://ollama.com/api)" and change the BaseURL
example that currently uses "https://ollama.com" to "https://ollama.com/api" so
that appended paths like "/v1/chat/completions" resolve correctly; locate the
strings "cloud models via ollama.com" and the example BaseURL
"https://ollama.com" in docs/providers/supported-providers/ollama.mdx and update
them accordingly.
🧹 Nitpick comments (1)
docs/providers/supported-providers/ollama.mdx (1)

3-3: Consider mirroring “self-hosted” in the meta description for consistency.

This helps align the frontmatter with the overview bullets and improves discoverability of self-hosted usage.

✏️ Proposed tweak
-description: "Ollama API conversion guide - local and cloud inference, OpenAI-compatible format, streaming, tool calling, and embeddings"
+description: "Ollama API conversion guide - local, self-hosted, and cloud inference, OpenAI-compatible format, streaming, tool calling, and embeddings"

Comment on lines 9 to 13
Ollama is a **local-first, OpenAI-compatible inference engine** for running large language models on personal computers or servers.
It also supports cloud models via ollama.com.
Bifrost delegates to the OpenAI implementation while supporting Ollama's unique configuration requirements. Key characteristics:
- **Local-first deployment** - Run models locally or on private infrastructure
- **Cloud support** - Access models hosted remotely

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🌐 Web query:

Ollama cloud API base URL authentication documentation

💡 Result:

  • Base URL (local, no auth): http://localhost:11434/api (or just http://localhost:11434 as the host, with /api/... paths). No authentication is required for local access. [1][2]
  • Base URL (Ollama Cloud / ollama.com): https://ollama.com/api (same endpoints, hosted). Direct cloud access requires authentication. [1][2][3]

Authentication options (cloud):

  1. CLI/device sign-in (automatic auth for cloud-backed requests via localhost): run ollama signin. After signing in, requests sent to your local Ollama API (e.g., http://localhost:11434/api/generate with a *-cloud model) are automatically authenticated upstream. [2][3]
  2. API keys (programmatic / direct to ollama.com):
    • Create an API key and set OLLAMA_API_KEY, then send it as a Bearer token:
      • Header: Authorization: Bearer $OLLAMA_API_KEY
      • Example endpoint: https://ollama.com/api/generate [2]
    • The Cloud docs also show using the Ollama client with host="https://ollama.com" plus the same Bearer header. [3]

Official docs pages: Base URL / intro [1], Authentication [2], Cloud API access [3].

Sources:
[1] https://docs.ollama.com/api/introduction
[2] https://docs.ollama.com/api/authentication
[3] https://docs.ollama.com/cloud


🏁 Script executed:

# Check repository structure for PR/stack context
git log --oneline -10 2>/dev/null | head -20

# Look for any PR-related files or documentation
find . -name "*.md" -o -name "*.txt" | grep -i -E "(pr|stack|change|release)" | head -20

# Search for Ollama-related files and their implementation
find . -type f -name "*ollama*" | head -20



🏁 Script executed:

# Check if there are references to Ollama in the codebase beyond the documentation
rg "ollama" -t tsx -t ts -t js -t py --ignore-case -l | head -20



🏁 Script executed:

# Read the Ollama provider implementation
cat -n ./core/providers/ollama/ollama.go



🏁 Script executed:

# Check git history for stack context
git log --oneline -20 --all

# Check for any stack markers or related PR files
find . -name "*.md" -path "*/docs/*" | grep -i stack

# Look at the current documentation file
cat -n ./docs/providers/supported-providers/ollama.mdx



🏁 Script executed:

# Search for any references to "cloud" or "ollama.com" in the provider implementation
rg "cloud|ollama\.com" ./core/providers/ollama/ -i -A 2 -B 2



Correct cloud endpoint examples to use the full API base path.

The documentation at lines 9 and 37 references ollama.com and https://ollama.com respectively, but the official Ollama Cloud API base URL is https://ollama.com/api. This is critical for users configuring the cloud endpoint:

  • Line 9: Change "cloud models via ollama.com" to clarify the actual API base: "via the Ollama Cloud service (https://ollama.com/api)"
  • Line 37: Update the BaseURL example from https://ollama.com to https://ollama.com/api to match the official API structure

Given the implementation appends paths like /v1/chat/completions directly to the BaseURL, users will need the complete API base path to reach the correct endpoint.
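To illustrate why the trailing /api matters when paths are appended to BaseURL, here is a hedged sketch using the Bearer-token scheme from the Ollama authentication docs. The helper buildCloudRequest is hypothetical and not Bifrost's actual code; the /generate path is taken from the official docs quoted above:

```go
package main

import (
	"fmt"
	"net/http"
	"strings"
)

// buildCloudRequest assembles a Bearer-authenticated request against the
// Ollama Cloud API. Because endpoint paths are appended directly to the
// configured base URL, BaseURL must carry the full API base path
// (https://ollama.com/api), not just https://ollama.com.
func buildCloudRequest(baseURL, path, apiKey string) (*http.Request, error) {
	endpoint := strings.TrimRight(baseURL, "/") + path
	req, err := http.NewRequest(http.MethodPost, endpoint, nil)
	if err != nil {
		return nil, err
	}
	// Cloud requests authenticate with a Bearer token; local instances
	// (http://localhost:11434) need no Authorization header.
	req.Header.Set("Authorization", "Bearer "+apiKey)
	return req, nil
}

func main() {
	req, err := buildCloudRequest("https://ollama.com/api", "/generate", "YOUR_OLLAMA_API_KEY")
	if err != nil {
		panic(err)
	}
	fmt.Println(req.Method, req.URL.String()) // POST https://ollama.com/api/generate
}
```

With BaseURL set to https://ollama.com instead, the same joining logic would produce https://ollama.com/generate, which is the website rather than the API.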


Comment on lines +37 to +38
For self-hosted models, ensure you have an Ollama instance running and configured with the correct BaseURL (e.g., `http://localhost:11434`).
To access cloud-based models, set BaseURL to your cloud endpoint (e.g., `https://ollama.com`) and provide an API key (see [Authentication](https://docs.ollama.com/api/authentication)).

⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🌐 Web query:

Ollama API cloud endpoint authentication BaseURL

💡 Result:

  • Local Ollama REST API BaseURL (no auth): http://localhost:11434/api (no authentication required for local access). [1][2]
  • Ollama Cloud REST API BaseURL (requires auth): https://ollama.com/api (authenticate with Authorization: Bearer <OLLAMA_API_KEY>). [1][2]
  • Local OpenAI-compatible BaseURL (OpenAI SDKs): http://localhost:11434/v1/ (the api_key parameter is required by some SDKs but is ignored by Ollama locally). [3]
  • Using cloud models via your local server: you still call http://localhost:11434/api/..., and after ollama signin Ollama automatically authenticates the cloud offload behind the scenes. [2]

Sources:
[1] Ollama API Introduction (Base URL) (docs.ollama.com)
[2] Ollama API Authentication (docs.ollama.com)
[3] Ollama OpenAI compatibility (docs.ollama.com)



Correct the cloud BaseURL to include the API path and complete the local endpoint.

The cloud BaseURL should be https://ollama.com/api (not https://ollama.com, which points to the website), and the local example should specify http://localhost:11434/api for REST API calls. This explicitly distinguishes the correct endpoints. The authentication guidance is accurate: cloud requests require an API key (Bearer token), while local REST API access does not require authentication.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
docs/providers/supported-providers/ollama.mdx (1)

115-119: Potential inconsistency with new cloud authentication guidance.

Line 116 states authentication is "optional" and describes it as an "optional Bearer token for protected instances." However, the new guidance at line 37 indicates that cloud access via ollama.com requires an API key.

Consider updating this note to clarify that:

  • Local access: No authentication required
  • Cloud access (ollama.com): API key authentication required
  • Self-hosted remote instances: Authentication optional based on deployment

This would align the two sections and prevent confusion for users configuring cloud access.

♻️ Duplicate comments (2)
docs/providers/supported-providers/ollama.mdx (2)

9-12: Cloud support documentation additions look good overall.

The additions accurately reflect the new capability to access Ollama cloud models. The "Cloud support" bullet and updated overview clearly communicate the dual local/cloud functionality.

Note: The cloud endpoint URL (ollama.com) was flagged in a previous review suggesting it should include the API path. Please ensure the previous feedback is addressed.


36-37: Self-hosted vs cloud guidance is helpful, but cloud BaseURL needs verification.

The differentiation between self-hosted and cloud access is clear and useful. The link to the official authentication documentation is a good addition.

However, as noted in previous review comments, the cloud BaseURL example (https://ollama.com) may need to be https://ollama.com/api based on official Ollama documentation. Please address the earlier feedback regarding the correct cloud endpoint path.
