feat: Add Forge LLM provider support #299

Open

Yiiii0 wants to merge 1 commit into souzatharsis:main from Yiiii0:feature/add-forge-provider

Conversation

Yiiii0 commented Feb 16, 2026

## Changes

- Added Forge detection in LLMBackend so requests are routed through litellm with an `openai/` prefix and the Forge `api_base`; added usage docs
- Environment variable: FORGE_API_KEY
- Base URL: https://api.forge.tensorblock.co/v1
- Model format: Provider/model-name (e.g., OpenAI/gpt-4o)
- Non-breaking: purely additive; existing providers are untouched
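The routing described above can be sketched as a small helper. This is an illustrative sketch, not the PR's actual code: the function name `resolve_forge_model` and the exact kwarg names are assumptions, but the mapping (prepend `openai/` to the model, point `api_base` at Forge, read the key from `FORGE_API_KEY`) follows the bullets above.

```python
import os

# Forge's OpenAI-compatible endpoint, per the PR description.
FORGE_API_BASE = "https://api.forge.tensorblock.co/v1"


def resolve_forge_model(model_name: str) -> dict:
    """Map a Forge model name (e.g. "OpenAI/gpt-4o") to litellm call kwargs.

    Hypothetical helper: Forge is OpenAI API compatible, so the model is
    passed to litellm under the "openai/" provider prefix, with api_base
    redirected to the Forge endpoint and the key taken from FORGE_API_KEY.
    """
    return {
        "model": f"openai/{model_name}",
        "api_base": FORGE_API_BASE,
        "api_key": os.environ.get("FORGE_API_KEY"),
    }
```

The resulting dict would be splatted into a `litellm.completion(**kwargs, messages=...)` call by the backend.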

## About Forge

Forge (https://github.com/TensorBlock/forge) is an open-source middleware that routes inference across 40+ upstream providers (including OpenAI, Anthropic, Gemini, DeepSeek, and OpenRouter). It is OpenAI API compatible: it works with the standard OpenAI SDK by changing only `base_url` and `api_key`.
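To make the compatibility claim concrete, here is a sketch of a standard OpenAI-style chat-completions request aimed at Forge, built with the standard library only so it needs no SDK installed. The helper name `build_chat_request` is invented for illustration; the request shape is the ordinary `/chat/completions` payload, with only the base URL and key changed.

```python
import json
import urllib.request

# Forge's OpenAI-compatible endpoint, per the PR description.
FORGE_API_BASE = "https://api.forge.tensorblock.co/v1"


def build_chat_request(
    api_key: str, model: str, messages: list
) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style chat request against Forge.

    Illustrative only: because Forge speaks the OpenAI API, nothing here
    is Forge-specific except the base URL and the bearer token.
    """
    body = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        f"{FORGE_API_BASE}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

Sending the request with `urllib.request.urlopen(...)` (given a valid `FORGE_API_KEY`) would return the familiar OpenAI-style JSON response.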

## Motivation

We have seen growing interest from users who standardize on Forge for their model management and want to use it natively with Podcastfy. This integration aims to bridge that gap.

## Key Benefits

- Self-Hosted & Privacy-First: Forge is open-source and designed to be self-hosted, which is critical for users who require data sovereignty
- Future-Proofing: Forge acts as a decoupling layer; instead of Podcastfy maintaining individual adapters for every new provider, Forge users gain access to new providers immediately
- Compatibility: supports established aggregators (such as OpenRouter) as well as direct provider connections (BYOK)

## References

- Repo: https://github.com/TensorBlock/forge
- Docs: https://www.tensorblock.co/api-docs/overview
- Main Page: https://www.tensorblock.co/