feat: add OpenAI API support with multi-provider architecture #1
Open
keyliass wants to merge 3 commits into richstokes:main from
Conversation
- Add OpenAI client implementation with streaming support
- Introduce --client_a and --client_b arguments for provider selection
- Support mixed conversations (Ollama + OpenAI)
- Add environment configuration with .env file support
- Implement auto-detection of default models per provider
- Add openai and python-dotenv dependencies
- Update README with new usage examples and configuration guide
- Maintain backward compatibility with existing Ollama functionality
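In code terms, the provider selection boils down to something like this (a rough sketch of the idea, not the literal diff; the default model names shown are illustrative):

```python
import argparse

# Illustrative defaults; the PR auto-detects a default model per provider.
DEFAULT_MODELS = {"ollama": "llama3", "openai": "gpt-4o"}

parser = argparse.ArgumentParser(description="InfiniChat")
parser.add_argument("--client_a", choices=["ollama", "openai"], default="ollama",
                    help="provider for speaker A")
parser.add_argument("--client_b", choices=["ollama", "openai"], default="ollama",
                    help="provider for speaker B")
args = parser.parse_args()

# Each speaker gets its provider's auto-detected default model.
model_a = DEFAULT_MODELS[args.client_a]
model_b = DEFAULT_MODELS[args.client_b]
```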
Add OpenAI API support
Hey! I've been playing around with InfiniChat and thought it would be cool to add OpenAI support alongside the existing Ollama functionality.
What's new
I added the ability to use OpenAI models (like GPT-4o) in addition to local Ollama models. The fun part is that you can now mix and match: have a conversation between a local model and OpenAI, or have two different OpenAI models talk to each other.
How it works
Added two new flags:
`--client_a` and `--client_b` to choose between `ollama` and `openai`.
Some examples:
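Something like this, assuming the usual `python main.py` entry point (the script name here is my guess; default models are auto-detected per provider, so no extra flags are needed):

```bash
# Two local Ollama models, same as before (defaults unchanged)
python main.py

# A local Ollama model talking to an OpenAI model
python main.py --client_a ollama --client_b openai

# Two OpenAI models talking to each other
python main.py --client_a openai --client_b openai
```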
Setup
You just need to add your OpenAI API key to a .env file (I've included an example). Everything else works the same.
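The key is picked up from the environment via python-dotenv. For example (the variable name below is the one the openai client reads by default; adjust if the included example file differs):

```
# .env (keep this out of version control)
OPENAI_API_KEY=sk-...
```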
Technical stuff
I've been testing it with debates, creative writing prompts, and just random conversations.
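Under the hood, the idea is a small abstraction so both providers stream tokens the same way. Roughly (a simplified sketch, not the literal code; names here are illustrative):

```python
import ollama
from dotenv import load_dotenv
from openai import OpenAI

load_dotenv()  # loads OPENAI_API_KEY from .env into the environment


def stream_reply(provider: str, model: str, messages: list[dict]) -> str:
    """Stream a reply from either backend, printing tokens as they arrive."""
    text = ""
    if provider == "openai":
        client = OpenAI()  # reads OPENAI_API_KEY from the environment
        stream = client.chat.completions.create(
            model=model, messages=messages, stream=True
        )
        for chunk in stream:
            delta = chunk.choices[0].delta.content if chunk.choices else None
            if delta:
                print(delta, end="", flush=True)
                text += delta
    else:  # ollama
        for chunk in ollama.chat(model=model, messages=messages, stream=True):
            piece = chunk["message"]["content"]
            print(piece, end="", flush=True)
            text += piece
    return text
```

The conversation loop then just alternates calls with each speaker's own provider/model pair, which is what keeps the existing Ollama-only setup working as before.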
Let me know if you'd like me to change anything or if you have questions about the implementation.
(PS: This is my first ever pull request, so don't hesitate to tell me if I've done anything wrong.)