
Local AI Chatbot with Llama 3.2 3B

A fully functional chatbot that runs entirely on your local machine using Ollama and Streamlit.

✨ Features

  • 🔒 Complete Privacy: All processing happens on your local machine
  • 💰 Zero Cost: No API subscriptions or usage fees
  • ⚡ Fast Experimentation: Quick responses with lightweight models
  • 🎯 Simple Setup: Get started in minutes
  • 🔧 Extensible: Easy to customize and extend

🎬 Demo

Below is a quick, fully‑local demo of the chatbot running in Streamlit:

Demo of the local AI chatbot

📋 Prerequisites

  • Python 3.10 or higher
  • Ollama installed on your system

🚀 Installation

1. Install Ollama

Download and install Ollama from the official website:

https://ollama.com/download

Verify installation:

ollama --version

2. Pull the Llama 3.2 Model

ollama pull llama3.2:3b

3. Clone This Repository

git clone https://github.com/atulmkamble/zero_to_chatbot
cd zero_to_chatbot

4. Install Python Dependencies

pip install -r requirements.txt

5. Configure Environment Variables

Create a .env file in the project root:

OLLAMA_URL="http://localhost:11434/api/generate"
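The scripts read this setting at startup. As a rough sketch of how that works (this stdlib-only loader is illustrative; the repo may instead use a package such as python-dotenv):

```python
import os

def load_env(path=".env"):
    """Minimal .env loader: puts KEY="value" lines into os.environ.

    Illustrative sketch only -- a library like python-dotenv does this
    more robustly (multiline values, export prefixes, etc.).
    """
    try:
        with open(path) as f:
            for line in f:
                line = line.strip()
                if line and not line.startswith("#") and "=" in line:
                    key, _, value = line.partition("=")
                    os.environ.setdefault(key.strip(), value.strip().strip('"'))
    except FileNotFoundError:
        pass  # no .env file; fall back to defaults below

load_env()
OLLAMA_URL = os.environ.get("OLLAMA_URL", "http://localhost:11434/api/generate")
```

Keeping the URL in `.env` means you can point the app at a remote Ollama host later without touching the code.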

📦 Project Structure

.
├── main.py           # Simple CLI chatbot
├── app.py            # Streamlit web interface
├── .env              # Environment variables
├── requirements.txt  # Python dependencies
└── README.md         # This file

🎮 Usage

Run the CLI Chatbot

python main.py

This will send a test prompt to the local model and display the response in your terminal.
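In essence, the CLI script makes a single HTTP POST to Ollama. A minimal sketch (function names here are illustrative, not necessarily the repo's actual code; the endpoint and payload follow Ollama's `/api/generate` format):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(prompt: str, model: str = "llama3.2:3b") -> dict:
    """Assemble the request body Ollama's generate endpoint expects."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask(prompt: str, model: str = "llama3.2:3b") -> str:
    """Send the prompt to the local Ollama server and return its reply."""
    data = json.dumps(build_payload(prompt, model)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask("Say hello in one sentence."))
```

With `"stream": False` the server returns one JSON object whose `response` field holds the full completion.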

Run the Web Interface

streamlit run app.py

Open your browser and navigate to the local URL Streamlit prints in the terminal (http://localhost:8501 by default).

🔧 How It Works

  1. Ollama runs the Llama 3.2 3B model locally on your machine
  2. The Python scripts communicate with Ollama's API via HTTP requests
  3. Streamlit provides an interactive web interface for the chatbot
  4. All conversations stay on your local machine - no data leaves your system
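On point 2: by default Ollama streams its reply as newline-delimited JSON, one chunk per line, each carrying a `response` fragment. A small sketch of reassembling such a stream (the helper name is mine, not from the repo):

```python
import json

def join_stream(lines):
    """Concatenate the "response" fragments from Ollama's NDJSON stream.

    Each non-empty line is a JSON object like {"response": "Hel", ...};
    joining the fragments in order yields the full completion text.
    """
    return "".join(
        json.loads(line)["response"] for line in lines if line.strip()
    )
```

Setting `"stream": false` in the request body sidesteps this entirely and returns one complete JSON object instead.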

🛠️ Customization

Change the Model

To use a different model, first pull it:

ollama pull <model-name>

Then update the model name in your code:

"model": "<model-name>"
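For example, after pulling another model you would swap its name into the request payload (the `mistral` name below is just an example; any model you have pulled works):

```python
# Illustrative payload after switching models -- only the "model"
# field changes; the rest of the request stays the same.
payload = {
    "model": "mistral",   # was "llama3.2:3b"
    "prompt": "Hello!",
    "stream": False,
}
```

Smaller models respond faster; larger ones tend to give better answers, so pick per your hardware.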

📚 Learn More

Read the full tutorial on my blog: Zero to Chatbot: Build a Local AI Assistant
