An AI-powered, multi-agent care planning system for Diabetes & Hypertension that relies strictly on official medical guidelines (RAG) to prevent hallucinations.
This software is for educational and research purposes only. It is NOT a diagnostic tool. All outputs must be verified by qualified healthcare professionals.
Chronic Care AI is a Multi-Agent Retrieval-Augmented Generation (RAG) pipeline that answers only from ingested, authoritative PDF guidelines (e.g., ADA, Hypertension guidelines). If the system cannot find supporting evidence in the ingested sources, it refuses to respond.
Core components:
- Supervisor agent (LangGraph) orchestrates worker agents.
- Trends Agent: analyzes time-series patient logs (BP, glucose, weight).
- Alerts Agent: flags guideline-based risks.
- Care Planner: generates daily care plans with inline citations.
- RAG Engine: ChromaDB vector store, dense retrieval, reranking & citation enforcement.
- Production features: Redis caching, reranker, Groq inference acceleration.
- FastAPI backend exposes REST/Graph endpoints.
- Streamlit (or similar) front-end for clinicians/patients.
- LangGraph manages multi-agent workflows.
- ChromaDB (local) or hosted vector DB stores passages & embeddings.
- Redis for caching retrieved contexts and rerank results.
- Groq (or other accelerator) for fast LLM inference.
- Ingestion pipeline converts PDFs → passages → embeddings → vector DB.
Mermaid overview:

```mermaid
graph TD
    User --> API[FastAPI]
    API --> Supervisor[LangGraph Supervisor]
    Supervisor --> Trend[Trends Agent]
    Supervisor --> Alert[Alerts Agent]
    Supervisor --> Plan[Care Planner]
    Plan & Alert -->|query| VectorDB[(ChromaDB)]
    PDF[Guideline PDFs] -->|ingest| VectorDB
    Supervisor --> Cache[(Redis)]
    Supervisor -->|infer| Groq[Groq LLM]
```
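The supervisor/worker orchestration in the diagram can be sketched framework-agnostically. The real pipeline uses LangGraph; the agent functions and state keys below are illustrative stand-ins, not the project's actual API:

```python
# Minimal sketch of the supervisor/worker pattern: a shared state dict
# flows through each worker agent in turn. Names are illustrative.

def trends_agent(state: dict) -> dict:
    state["trends"] = f"analyzed {len(state['logs'])} log entries"
    return state

def alerts_agent(state: dict) -> dict:
    # Flag guideline-based risks, e.g. hypertensive-crisis-range systolic BP.
    state["alerts"] = [e for e in state["logs"] if e.get("bp_systolic", 0) >= 180]
    return state

def care_planner(state: dict) -> dict:
    state["plan"] = "plan based on: " + state["trends"]
    return state

def supervisor(state: dict) -> dict:
    # Route the patient state through each worker, mimicking the
    # LangGraph supervisor's orchestration of the agents above.
    for agent in (trends_agent, alerts_agent, care_planner):
        state = agent(state)
    return state

result = supervisor({"logs": [{"bp_systolic": 185}, {"bp_systolic": 120}]})
```

In the actual LangGraph graph the supervisor decides routing dynamically; a fixed loop is used here only to keep the sketch self-contained.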
Prereqs:
- Python 3.10+
- Docker & Docker Compose (optional but recommended)
- Redis (local or via Docker)
- At least one GPU, or access to a Groq endpoint if using Groq for inference (optional)
Clone the repository:

```bash
git clone https://github.com/yourusername/Chronic-Disease-Care-Planner.git
cd Chronic-Disease-Care-Planner
```
Set up the Python environment:

```bash
python -m venv .venv
source .venv/bin/activate   # Windows: .venv\Scripts\activate
pip install -r requirements.txt
```
Start supporting services (example using Docker):

```bash
docker run -d --name redis -p 6379:6379 redis:7
# optional: run a local Chroma/DB service if using a separate server
```
Configure environment:
- Copy `.env.example` → `.env` and set:
  - REDIS_URL=redis://localhost:6379
  - CHROMA_DIR=./data/chroma
  - GROQ_API_KEY=your_groq_key (if applicable)
  - RERANKER_CONF (path or params)
  - CACHE_TTL (seconds)
  - LANGCHAIN or LANGGRAPH config entries (as required)
- Note: exact env keys depend on your code; check app/config.
Ingest guideline PDFs:
- Place PDFs under `data/guidelines/`
- Run ingestion (example):

```bash
python scripts/ingest.py --source data/guidelines --store chroma --chunk-size 500
```

- This creates passages, computes embeddings, and stores the vectors in ChromaDB.
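The PDF → passages step hinges on chunking. A minimal sketch of overlapping character-based chunking, assuming the 500-character chunk size from the example command (the real ingestion script may chunk by tokens or sentences instead):

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split extracted PDF text into overlapping passages for embedding.

    Overlap keeps sentences that straddle a chunk boundary retrievable
    from both neighboring chunks.
    """
    if chunk_size <= overlap:
        raise ValueError("chunk_size must exceed overlap")
    step = chunk_size - overlap
    chunks = []
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
    return chunks

# 1200 characters with size 500 / overlap 50 yields chunks at offsets 0, 450, 900.
passages = chunk_text("x" * 1200, chunk_size=500, overlap=50)
```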
Run services:
- FastAPI:

  ```bash
  uvicorn app.main:app --reload --port 8000
  ```

- Streamlit UI:

  ```bash
  streamlit run app/ui.py
  ```
Example API usage
- POST patient data to /api/v1/assess and receive a cited, guideline-anchored care plan.
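A hypothetical response shape for that endpoint, illustrating the citation-enforcement contract (field names here are assumptions, not the actual schema; check the Pydantic models in the app):

```json
{
  "care_plan": "Check fasting glucose daily; reduce sodium intake per guideline.",
  "alerts": ["BP reading above guideline threshold"],
  "citations": [
    {"source_id": "ada-standards-of-care.pdf", "page": 42}
  ]
}
```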
- Recommended: Docker Compose for local production-like stacks (FastAPI + Redis + Chroma + Nginx).
- Provide a `docker-compose.yml` that starts:
  - web (FastAPI)
  - ui (Streamlit or static)
  - redis
  - chroma (if containerized)
  - ingress (nginx)
Basic steps:

```bash
docker compose up --build
```

Kubernetes:
- Provide manifests or Helm charts for scaling (use HorizontalPodAutoscaler for workers), configure PersistentVolumes for Chroma and Redis persistence, and set Secrets for API keys.
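The HPA suggestion above might look like this (resource names and thresholds are illustrative, assuming the FastAPI workers run as a Deployment named `web`):

```yaml
# Example HorizontalPodAutoscaler for the FastAPI workers.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: web-hpa
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: web
  minReplicas: 2
  maxReplicas: 10
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70
```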
- Citation enforcement: final outputs must include source IDs + page numbers extracted at ingestion. If no supporting passage above confidence threshold, agent should return "No guideline evidence found."
- Reranking: Use a cross-encoder or lightweight reranker to reorder top-k dense results before final prompting.
- Caching: Cache top retrievals and rerank results in Redis with TTL to reduce latency.
- Embedding drift: Re-ingest documents whenever guidelines update; version your vector store.
- Privacy: PHI must be handled with encryption at rest and in transit. Avoid sending raw PHI to third-party APIs.
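The citation-enforcement rule above can be sketched as a post-retrieval gate. A minimal sketch, assuming reranked passages carry a relevance score plus the source ID and page number captured at ingestion (the threshold value and field names are illustrative):

```python
NO_EVIDENCE = "No guideline evidence found."

def enforce_citations(passages: list[dict], threshold: float = 0.6) -> dict:
    """Gate the final answer: only passages whose rerank score clears the
    confidence threshold may back a response; otherwise refuse to answer."""
    supported = [p for p in passages if p["score"] >= threshold]
    if not supported:
        return {"answer": NO_EVIDENCE, "citations": []}
    citations = [{"source_id": p["source_id"], "page": p["page"]}
                 for p in supported]
    return {"answer": "draft answer grounded in the cited passages",
            "citations": citations}

hit = enforce_citations([{"score": 0.82, "source_id": "ada-2024.pdf", "page": 31}])
miss = enforce_citations([{"score": 0.20, "source_id": "ada-2024.pdf", "page": 31}])
```

In the real pipeline the "draft answer" step is the LLM call; the gate runs before returning, so a low-evidence query never reaches the user with an uncited answer.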
- REDIS_URL
- CHROMA_DIR
- GROQ_API_KEY
- INFERENCE_PROVIDER (groq|openai|local)
- RERANKER_MODEL
- CACHE_TTL
- LOG_LEVEL
(Adapt based on actual code — check app/config.py)
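One way to load these variables with local-dev defaults (a sketch only; the actual app/config.py may use Pydantic settings or different key names):

```python
import os

def load_config(env=os.environ) -> dict:
    """Read runtime settings from the environment, falling back to
    defaults suitable for local development."""
    return {
        "redis_url": env.get("REDIS_URL", "redis://localhost:6379"),
        "chroma_dir": env.get("CHROMA_DIR", "./data/chroma"),
        "groq_api_key": env.get("GROQ_API_KEY"),             # None if unset
        "inference_provider": env.get("INFERENCE_PROVIDER", "groq"),
        "reranker_model": env.get("RERANKER_MODEL", ""),
        "cache_ttl": int(env.get("CACHE_TTL", "3600")),
        "log_level": env.get("LOG_LEVEL", "INFO"),
    }

# Passing a dict instead of os.environ makes the loader easy to unit-test.
cfg = load_config({"CACHE_TTL": "120"})
```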
- Unit tests for ingestion, retrieval, and agents.
- Integration tests for RAG pipeline with a small guideline fixture.
- CI pipeline should run linting, tests, and build a Docker image.
- Add e2e tests that verify "no-answer" behavior when no evidence exists.
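The "no-answer" behavior could be exercised like this (pytest-style; `retrieve` is a toy stand-in for the real retriever, used only to make the test self-contained):

```python
def retrieve(query: str, corpus: dict[str, str]) -> list[str]:
    """Toy retriever: return corpus entries sharing any word with the query."""
    words = set(query.lower().split())
    return [text for text in corpus.values()
            if words & set(text.lower().split())]

def answer(query: str, corpus: dict[str, str]) -> str:
    hits = retrieve(query, corpus)
    return hits[0] if hits else "No guideline evidence found."

def test_no_answer_without_evidence():
    corpus = {"doc1": "metformin is first-line therapy"}
    # Out-of-scope query: the system must refuse rather than hallucinate.
    assert answer("pediatric oncology dosing", corpus) == "No guideline evidence found."
    # In-scope query: an evidence-backed answer is returned.
    assert "metformin" in answer("metformin therapy", corpus)

test_no_answer_without_evidence()
```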
- Low recall: increase chunk overlap, try a higher-dimensional embedding model, or tune retriever parameters.
- Hallucinations: tighten reranker threshold and reduce prompt-completion temperature; enforce citation check before returning.
- Slow inference: enable Groq or batch requests; cache frequently asked queries.
- Fork repo
- Branch: `feat/your-feature`
- Add tests and documentation
- Open PR with description and testing notes
License: MIT
For support or questions, open an issue or contact sanhariharan.7@gmail.com.