Welcome to the Humanized Chatbot project! This chatbot is designed to stand in for a human presence on messaging platforms such as Telegram and WhatsApp. It uses the LangChain framework and OpenAI to create a human-like conversational experience for users. The chatbot learns from previous conversations and builds a knowledge graph memory to improve its responses over time.
- Humanized Conversations: The chatbot is trained to hold human-like conversations, making interactions feel natural and engaging.
- Pattern Recognition: By analyzing previous conversations, the chatbot identifies patterns and learns from them to improve its responses.
- Knowledge Graph Memory: The chatbot maintains a knowledge graph memory that lets it retain information and give more contextually relevant answers (see the memory sketch after this list).
- Integration with Flask: The project uses Flask, a Python web framework, for easy integration and deployment of the chatbot.
- User-friendly Admin Interface: You can open the chatbot's admin webpage at /admin on the Flask server and create embeddings there without any backend development knowledge.
- Dynamic Embedding: The latest embedding generated through the admin interface is automatically picked up by the chatbot, so it always works with the most recent information.
- Chainlet for Response Integration: To streamline response integration, the project introduces a concept called a "chainlet," inspired by Chainlit and similar in spirit to Streamlit.
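The exact memory wiring lives in the repository's code; purely as orientation, a minimal knowledge graph memory built with LangChain's ConversationKGMemory might look like the sketch below. The class and method names come from the classic `langchain` package, and the surrounding setup is an assumption for illustration, not the project's actual code:

```python
# Sketch only: assumes the classic `langchain` package and an OPENAI_API_KEY
# set in the environment.
from langchain.llms import OpenAI
from langchain.memory import ConversationKGMemory

llm = OpenAI(temperature=0)
memory = ConversationKGMemory(llm=llm)

# Each exchange is mined for (subject, relation, object) triples and stored in the graph.
memory.save_context(
    {"input": "Sam is my coworker and he loves hiking."},
    {"output": "Got it, I'll remember that about Sam."},
)

# Later turns pull back only the facts relevant to the new input.
print(memory.load_memory_variables({"input": "What do you know about Sam?"}))
```

Because the memory is a graph of facts rather than a raw transcript, the chatbot can answer follow-up questions about an entity without replaying the entire conversation history.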
Follow these steps to set up and run the Humanized Chatbot on your local machine:
- Make sure you have Python installed (version 3.9 or higher).
- Clone the repository to your local machine:

  ```bash
  git clone https://github.com/your_username/your_project.git
  cd your_project
  ```

- Install Flask and the other required dependencies:

  ```bash
  pip install -r requirements.txt
  # Add any other necessary packages here
  ```

- Run the Flask application:

  ```bash
  python app.py
  ```

- Access the chatbot's admin webpage: open your web browser and go to http://localhost:5000/admin. Here, you can create new embeddings to improve the chatbot's responses (a hypothetical sketch of such a route follows these steps).
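The admin page itself is implemented in app.py; for orientation only, a hypothetical Flask route that builds and persists an embedding index could look like this. The form field name, template, storage path, and the choice of FAISS are assumptions made for the example, not the repository's actual implementation:

```python
# Hypothetical /admin sketch, not the project's app.py. Assumes `faiss-cpu`
# is installed and OPENAI_API_KEY is set in the environment.
from flask import Flask, render_template, request
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import FAISS

app = Flask(__name__)

@app.route("/admin", methods=["GET", "POST"])
def admin():
    if request.method == "POST":
        # Text pasted into the admin form is chunked, embedded with OpenAI,
        # and saved so the chatbot can load the latest index at startup.
        text = request.form["knowledge_text"]          # field name is an assumption
        chunks = [c.strip() for c in text.split("\n\n") if c.strip()]
        store = FAISS.from_texts(chunks, OpenAIEmbeddings())
        store.save_local("embeddings/latest")          # path is an assumption
        return "Embedding created."
    return render_template("admin.html")               # template name is an assumption

if __name__ == "__main__":
    app.run(port=5000)
```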
Once the Flask server (app.py) is running, create your embedding through the admin page, then start the chatbot with `chainlit run main.py -w`. After that, the chatbot is ready to be used on your desired messaging platform. A stripped-down Chainlit entry point is sketched below.
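This sketch assumes the Chainlit 1.x decorator API, where the message handler receives a cl.Message object (older versions pass a plain string); the echo reply is a placeholder for the project's actual LangChain pipeline:

```python
# Minimal Chainlit entry point sketch; the real main.py wires in the LangChain
# conversation chain, the knowledge graph memory, and the latest embedding index.
import chainlit as cl

@cl.on_message
async def on_message(message: cl.Message):
    # Placeholder: replace this echo with a call into the chatbot pipeline.
    reply = f"You said: {message.content}"
    await cl.Message(content=reply).send()
```

With a main.py shaped like this, `chainlit run main.py -w` serves the bot locally, and the `-w` flag reloads it whenever the source files change.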
To contribute to the project or make custom modifications, follow these guidelines:
- Study the LangChain framework and OpenAI documentation to understand the core of the chatbot's functionality.
- Explore the Flask application in app.py to understand how the integration works and how embeddings are managed via the admin webpage.
- For any additional changes, create a new branch and commit your changes there:

  ```bash
  git checkout -b feature/your_feature_name
  ```

- Push the branch to the remote repository:

  ```bash
  git push origin feature/your_feature_name
  ```

- Open a pull request and describe the changes you've made. Your contributions are valuable to the project!
This project is licensed under the MIT License. See the LICENSE file for details.
- OpenAI - For providing powerful language processing capabilities.
- Flask - For the web application integration.
- LangChain - For facilitating pattern recognition and knowledge graph memory.
- Chainlit - Inspiration for the chainlet concept.
- Integration with more social media and messaging platforms
- More advanced prompt engineering
- A more detailed UI/UX for the application
- Docker support
- CI/CD pipeline
If you have any questions or suggestions regarding the project, feel free to contact us at:
- Email: vanamayaswanth@gmail.com
Thank you for using our Humanized Chatbot! We hope it enriches your messaging experience and provides helpful and engaging conversations. Happy chatting!