Sleek Streamlit chat app for Google Gemini (Files API). Dark, gradient UI with model picker, usage dialog, file/image/audio/PDF attach & preview, chat history, image persistence, robust error handling, and token usage tracking. Supports streaming replies and modular backend via google-genai.

brej-29/aurora-chat-streamlit


🌌 Aurora – Gemini Chat (Streamlit)

Sleek dark-mode AI chat with file/image/audio/PDF attachments, chat history, streaming replies, token usage, and robust error handling – powered by Streamlit & Google Gemini (Files API).



Built with the following tools and technologies:

Python | Streamlit | Google Gemini (google-genai) | Files API | python-dotenv

Table of Contents

  • Overview
  • Project Highlights
  • Features
  • Getting Started
  • Roadmap
  • License
  • Contact
  • Contributing
  • Screenshots


Overview

Aurora is a professional, dark-mode AI chat interface for Google Gemini with a polished gradient UI, sticky composer, and a modular backend. It supports image/audio/PDF uploads via the Files API, shows previews before send, persists uploaded files across turns, streams model output token-by-token, tracks usage, and handles errors gracefully (e.g., rate limits, temporary outages). The UI is tuned for productivity: model picker, usage dialog, suggestion chips, and a centered greeting on first run.


Project Highlights

  • Modern UI/UX: fixed bottom composer, gradient send button, attach modal with staged previews, centered hero + 2×2 suggestions.
  • Files API only: images, audio, and PDFs are uploaded once; references are reused across the session for follow-up questions.
  • Streaming replies: incremental rendering with a visible "Thinking…" placeholder.
  • Usage metrics: input/output/reasoning tokens per turn; running totals in a modal.
  • Resilient backend: modular backend/genai_backend.py with graceful fallbacks and robust retries for transient server errors.
  • Model switcher: choose between gemini-2.5-pro, gemini-2.5-flash, and gemini-2.5-flash-preview-09-2025, with gemini-2.0-flash as a fallback.
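The retry behaviour described above can be sketched as a generic backoff helper. This is an illustrative sketch, not the actual genai_backend.py implementation; the function names, attempt counts, and delays are assumptions.

```python
import random
import time

# TransientError stands in for the 503-style server errors the backend
# retries on; a real backend would catch the SDK's own exception types.
class TransientError(Exception):
    pass

def call_with_retries(fn, max_attempts=3, base_delay=1.0):
    """Retry fn() on transient errors with exponential backoff and jitter."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except TransientError:
            if attempt == max_attempts:
                raise  # give up after the final attempt
            # Exponential backoff: base, 2x, 4x, ... plus a little jitter.
            time.sleep(base_delay * 2 ** (attempt - 1) + random.uniform(0, 0.2))
```

A full version of this idea would also fall back to gemini-2.0-flash when the selected model keeps failing, matching the model-switcher fallback above.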

Features

  • Dark, indigo-magenta-gold gradient theme; compact header with model select (≈1/8 width) and Usage button.
  • Sticky composer with ＋ button → staging modal → Attach confirmation; tiny thumbnail chips above the composer.
  • Chat history with user/assistant bubbles; user messages display attached files inline.
  • Image persistence: ask follow-up questions about previously attached files without re-uploading.
  • "Thinking…" indicator placed right after the user's latest message.
  • Streaming output with smooth autoscroll behavior during the stream.
  • Friendly error surfaces (429 suggests switching models; 503 explains temporary unavailability; 400 offers guidance to simplify the request).
  • Token usage (prompt/response/reasoning) aggregated across the session.
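The per-turn token accounting could be accumulated with a small helper like the one below. The field names mirror the prompt/response/reasoning split mentioned above, but they are illustrative assumptions, not the app's actual data model.

```python
from dataclasses import dataclass

@dataclass
class UsageTotals:
    """Running session totals for the token counts reported per turn."""
    prompt_tokens: int = 0
    response_tokens: int = 0
    reasoning_tokens: int = 0

    def add_turn(self, prompt: int, response: int, reasoning: int = 0) -> None:
        # Each model call contributes its per-turn counts to the session totals.
        self.prompt_tokens += prompt
        self.response_tokens += response
        self.reasoning_tokens += reasoning

    @property
    def total(self) -> int:
        return self.prompt_tokens + self.response_tokens + self.reasoning_tokens
```

In the app, the Usage dialog would simply render these running totals after each model call.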

Getting Started

Follow these steps to run the project locally.

Project Structure

aurora-chat-streamlit/
├─ app.py                          # Streamlit UI & chat orchestration
├─ backend/
│  └─ genai_backend.py             # google-genai client, Files API upload, generate/stream helpers
├─ frontend/
│  └─ scroll.py                    # (Optional helper) one-shot scroll utilities for UX polish
├─ .env                            # contains GEMINI_API_KEY (not committed)
├─ requirements.txt
├─ LICENSE
└─ README.md

Prerequisites

  • Python 3.9+
  • A Google AI Studio API key with access to Gemini models
  • Internet connectivity to call the API

Installation

  1. Create and activate a virtual environment (recommended).

    python -m venv .venv
    # Windows:
    .venv\Scripts\activate
    # macOS/Linux:
    source .venv/bin/activate
    
  2. Install dependencies.

    pip install -r requirements.txt
    

Configuration

Create a .env file at the project root:

GEMINI_API_KEY=your_api_key_here

app.py loads this via python-dotenv. Environment variables also work.
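Key loading can be sketched as follows. Only the GEMINI_API_KEY variable and the use of python-dotenv are confirmed by this README; the helper name and the optional-import pattern are illustrative assumptions.

```python
import os

try:
    # python-dotenv, if installed, merges .env into os.environ.
    from dotenv import load_dotenv
    load_dotenv()
except ImportError:
    pass  # fall back to plain environment variables

def get_api_key() -> str:
    """Return the Gemini API key, failing fast with a clear message."""
    key = os.getenv("GEMINI_API_KEY")
    if not key:
        raise RuntimeError("GEMINI_API_KEY is not set (.env or environment).")
    return key
```

Failing fast here gives a clearer error at startup than letting the first API call reject the missing key.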

Usage

Run the app:

streamlit run app.py

Workflow inside the app:

  1. Pick a model from the header dropdown.
  2. (Optional) Click Usage to see token totals (updates after model calls).
  3. Start typing in the composer or click a suggestion chip.
  4. Click the ＋ button to open the attach modal → upload files → click Attach.
  5. Send your message. You’ll see your message bubble (with files) followed by a Thinking… placeholder and streamed output.
  6. Ask follow-ups without re-uploading; the Files API references persist for the session.
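The "upload once, reuse across turns" behaviour in step 6 can be sketched as a cache keyed by filename. In the real app the references live in Streamlit session state and come from the Files API; the class and parameter names here are illustrative assumptions.

```python
class FileReferenceCache:
    """Cache upload results so follow-up turns reuse the same reference."""

    def __init__(self, upload_fn):
        self._upload_fn = upload_fn  # e.g. a Files API upload call
        self._refs = {}

    def get(self, filename: str):
        # Upload only on first sight; later turns reuse the stored reference.
        if filename not in self._refs:
            self._refs[filename] = self._upload_fn(filename)
        return self._refs[filename]
```

Because the cached reference is attached to later prompts instead of the raw bytes, follow-up questions about the same file avoid a second upload.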

Roadmap

✅ Completed

  • Dark gradient UI with sticky composer and compact header
  • 2×2 suggestion chips and centered greeting on first load
  • Files API integration; staged previews and inline message attachments
  • Image/audio/PDF support; image persistence across turns
  • Streaming responses with "Thinking…" indicator
  • Error handling with user-friendly guidance (429/503/400)
  • Token usage: prompt/response/reasoning + session totals
  • Modular backend (genai_backend.py) and frontend utility (frontend/scroll.py)
  • Model picker: 2.5 Pro, 2.5 Flash, 2.5 Flash Preview, 2.0 Flash (fallback)

⏭️ Pending / Nice-to-Have

  • In-app model capability hints (vision/audio limits, file caps)
  • Chat export (markdown/HTML) and β€œShare link” (optional)
  • Theming controls (font size/compact mode/high-contrast)
  • Advanced file library view (rename/remove/inspect metadata)
  • Settings drawer (system prompt, temperature, safety toggles)
  • Unit tests and linting (pytest/ruff)
  • Example deployments (Streamlit Community Cloud / Docker)
  • Keyboard shortcuts cheat-sheet and accessibility polish (ARIA)
  • Basic analytics (per-turn latency, success/error rates)

License

MIT – see LICENSE for details.


Contact

Questions, feedback, or feature requests? Open an issue or reach out on LinkedIn.


Contributing

Contributions are very welcome! If you’d like to improve the UX, add tests, wire up deployments, or extend model features, please:

  1. Fork the repo and create a branch,
  2. Keep changes focused and documented,
  3. Open a PR with a clear description and screenshots where relevant.

If you use Aurora in your own project, I’d love to hear about it; please share a link! 🎉


Screenshots

