
feat: Add support for OpenRouter as an LLM provider. #35

Open

elskito wants to merge 1 commit into vas3k:main from elskito:add-support-for-openrouter

Conversation


@elskito elskito commented Jul 24, 2025

Summary

Adds OpenRouter API support to the app.

PR contains:

  • modified README
  • modified code for OpenRouter compatibility
  • modified config for OpenRouter compatibility

@elskito elskito changed the title Add support for OpenRouter as an LLM provider. feat: Add support for OpenRouter as an LLM provider. Jul 24, 2025
@elskito elskito force-pushed the add-support-for-openrouter branch from d0ed6f1 to eb963b3 Compare July 25, 2025 19:22
@vas3k (Owner) commented Aug 2, 2025

I've never heard of such a platform, to be honest. It looks like an aggregator that, in theory, gives access to many LLMs at once, which is certainly useful. But I doubt the need to include it in the standard deployment. Only models that work well with images (WebP, to be precise) are suitable for TaxHacker, so I would rather add proven LLMs to avoid confusing users and leaving them with a non-functional application.

For example, I would be happy if one of the DeepSeek or Ollama models were offered as a separate option for self-hosted users.

@adryserage

Code Review Analysis 🔍

Thanks for adding OpenRouter support! This is a clean implementation. Here are some suggestions:

✅ What looks good

  • Clean integration following existing provider patterns
  • Proper environment variable setup in .env.example
  • Good use of the existing ChatOpenAI class with custom baseURL
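
To make the praised pattern concrete, here is a rough sketch of what an OpenRouter provider entry reusing the OpenAI-compatible client could look like. The `LLMProviderConfig` interface and its field names are assumptions for illustration, not the actual TaxHacker code; only the OpenRouter base URL is a known value.

```typescript
// Hypothetical shape of a provider entry; the real lib/llm-providers.ts
// may use different field names.
interface LLMProviderConfig {
  name: string;
  baseURL: string;      // overrides the default OpenAI endpoint
  apiKeyEnvVar: string; // env var the key is read from
  defaultModel: string;
}

const openRouterProvider: LLMProviderConfig = {
  name: "openrouter",
  // OpenRouter exposes an OpenAI-compatible endpoint, so the existing
  // ChatOpenAI-style client can be reused by swapping only the base URL.
  baseURL: "https://openrouter.ai/api/v1",
  apiKeyEnvVar: "OPENROUTER_API_KEY",
  // OpenRouter model IDs are vendor-prefixed, e.g. "openai/gpt-4o-mini".
  defaultModel: "openai/gpt-4o-mini",
};
```

The key design point is that no new client code is needed: only the base URL, key source, and default model differ from the stock OpenAI provider.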

💡 Suggestions for improvement

  1. Model name inconsistency: The default model is openai/gpt-4.1-mini in .env.example and lib/llm-providers.ts, but openai/gpt-4o-mini in forms/settings.ts. Consider standardizing on one of them, e.g. openai/gpt-4o-mini, so the defaults agree across the codebase.

  2. Consider adding model suggestions: Similar to other providers, you might want to add a suggestedModels array to help users know which models are available on OpenRouter:
    ```typescript
    suggestedModels: ["openai/gpt-4o-mini", "anthropic/claude-3-haiku", "google/gemini-flash-1.5"]
    ```

  3. Error handling: OpenRouter may return different error formats than OpenAI. Consider adding specific error handling for OpenRouter responses.
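
A minimal sketch of what such error normalization might look like. The payload shapes handled here are assumptions based on OpenAI-style APIs, not verified OpenRouter behavior, and `normalizeProviderError` is a hypothetical helper, not an existing TaxHacker function.

```typescript
// Hypothetical error normalizer. OpenAI-compatible APIs typically nest
// errors as { error: { message, code } }, but aggregators sometimes
// return a flat { message } or upstream-provider metadata instead.
interface NormalizedLLMError {
  message: string;
  code?: string | number;
}

function normalizeProviderError(payload: unknown): NormalizedLLMError {
  if (typeof payload === "object" && payload !== null) {
    const obj = payload as Record<string, any>;
    // OpenAI-style nested error: { error: { message, code } }
    if (obj.error && typeof obj.error.message === "string") {
      return { message: obj.error.message, code: obj.error.code };
    }
    // Flat error: { message }
    if (typeof obj.message === "string") {
      return { message: obj.message };
    }
  }
  // Fall back to a generic message rather than crashing the UI.
  return { message: "Unknown error from LLM provider" };
}
```

Funneling every provider response through one normalizer keeps the rest of the app agnostic to which backend produced the error.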

🔗 Related

Note: I've opened PRs #58 and #59, which improve error messages for invalid models and add dynamic model fetching from provider APIs. These changes would complement your OpenRouter addition well!


Automated review by Aetheris
