Turn static documentation into live, agent-readable knowledge. Synchronize, Validate, and Enforce documentation quality using LLMs.
🇧🇷 Português | 🇺🇸 English
In modern software development, documentation rots faster than code. Developers hate writing it, and AI Agents hallucinate when reading outdated files.
DocSync is not just a linter. It is an Agentic Infrastructure Tool that treats your documentation as a living database. It sits between your codebase and your LLMs, ensuring that every Markdown file is accurate, up-to-date, and semantically structured for retrieval.
"Don't let your Agent read garbage. Feed it DocSync."
DocSync exposes your entire documentation knowledge base as a structured MCP Server.
- For Claude/Cursor: Connect DocSync to your assistant. It can now "read", "search", and "patch" your docs autonomously.
- No API glue code: it works out of the box with any MCP-compliant client.
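Under the hood, MCP traffic is JSON-RPC 2.0. As a rough sketch of what a client request to a DocSync tool might look like, here is a request builder in plain Python (the tool name `search_docs` and its arguments are illustrative, not DocSync's actual tool API):

```python
import json

def make_tool_call(tool: str, arguments: dict, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 `tools/call` request, as an MCP client would send it."""
    request = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }
    return json.dumps(request)

# Hypothetical: ask the server to search the docs for "installation"
payload = make_tool_call("search_docs", {"query": "installation"})
print(payload)
```

Any MCP-compliant client assembles requests of this shape for you; you never write this glue yourself.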
Forget regex. DocSync uses LLMs (OpenAI, Anthropic, Gemini) to understand the content of your docs.
- Does this README match what `main.py` actually does?
- Is the tone consistent across all 50 files?
- Are these installation steps actually executable?
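Checks like these boil down to handing the model both artifacts and asking a pointed question. A minimal sketch of how such a consistency prompt could be assembled (the function and prompt wording are illustrative, not DocSync internals):

```python
def build_consistency_prompt(readme_text: str, source_text: str) -> str:
    """Assemble an LLM prompt asking whether the docs match the code."""
    return (
        "You are a documentation reviewer.\n"
        "Answer YES or NO, then explain any mismatch.\n\n"
        f"README:\n{readme_text}\n\n"
        f"Source code:\n{source_text}\n\n"
        "Does the README accurately describe what the code does?"
    )

prompt = build_consistency_prompt("Run `app --serve` to start.", "def main(): serve()")
print(prompt)
```

The response is then parsed for the verdict, which is what makes this semantic validation rather than regex matching.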
Agnostic by design. Swap brains on the fly:
- `--provider openai` (GPT-4o for deep logic)
- `--provider anthropic` (Claude 3.5 Sonnet for huge context windows)
- `--provider gemini` (Gemini Pro for speed)
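At the CLI level, swapping brains can be as simple as a lookup table. A hypothetical sketch (the model names follow the ones above; the mapping is not DocSync's actual code):

```python
# Hypothetical mapping from --provider flag values to default models.
DEFAULT_MODELS = {
    "openai": "gpt-4o",
    "anthropic": "claude-3-5-sonnet",
    "gemini": "gemini-pro",
}

def resolve_model(provider: str) -> str:
    """Return the default model for a --provider flag value."""
    try:
        return DEFAULT_MODELS[provider]
    except KeyError:
        raise ValueError(f"Unknown provider: {provider!r}") from None

print(resolve_model("openai"))  # gpt-4o
```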
```bash
pip install docsync
```

Improve your README instantly using your preferred LLM:

```bash
export OPENAI_API_KEY="sk-..."
docsync improve README.md --instruction "Make it sound enterprise-ready"
```
### Launch MCP Agent
Turn your folder into a knowledge server:
```bash
docsync serve --port 8000
```

We use go-task to standardize development commands:

```bash
task install  # Install dependencies
task test     # Run tests
task lint     # Check code quality
task clean    # Remove cache
task serve    # Start MCP Server
```

DocSync is built with Resilience and Extensibility in mind.
- Core: Python 3.9+ with strict typing (MyPy).
- Interface: Rich CLI for humans, JSON-RPC for Agents.
- Pattern: Adapter Pattern for LLMs (easy to add Llama 3 or Mistral).
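The Adapter Pattern mentioned above could look roughly like this: each provider hides its SDK behind one common interface, so adding Llama 3 or Mistral means writing one new class. A sketch under those assumptions, not DocSync's actual classes:

```python
from typing import Protocol

class LLMProvider(Protocol):
    """Common interface every provider adapter implements."""
    def complete(self, prompt: str) -> str: ...

class OpenAIAdapter:
    def complete(self, prompt: str) -> str:
        # A real adapter would call the OpenAI SDK here.
        return f"[openai] {prompt}"

class AnthropicAdapter:
    def complete(self, prompt: str) -> str:
        # A real adapter would call the Anthropic SDK here.
        return f"[anthropic] {prompt}"

def get_provider(name: str) -> LLMProvider:
    adapters = {"openai": OpenAIAdapter, "anthropic": AnthropicAdapter}
    return adapters[name]()

print(get_provider("openai").complete("Improve this README"))
```

The rest of the codebase only ever sees `LLMProvider`, which is what keeps provider swaps a one-flag change.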
```mermaid
graph LR
    A[Human Dev] -->|CLI Commands| B[DocSync Core]
    C[AI Agent / IDE] -->|MCP Protocol| B
    B -->|Read/Write| D[Markdown Files]
    B -->|Analyze| E[LLM Provider API]
```
| Persona | Benefit |
|---|---|
| CTOs | Ensure compliance and documentation standards across 100+ repos. |
| DevOps | Auto-generate changelogs and deployment guides during CI/CD. |
| Solo Devs | Have a "Documentation Co-pilot" that writes the boring stuff for you. |
We are building the standard for AI-Augmented Documentation. Contributions are welcome for:
- New MCP Tools
- Additional LLM Providers (Ollama, Groq)
- Integrations (Notion, Confluence)
- Performance Core: Research migration to Go (Golang) for instant CLI execution (< 10ms).
- Cloud Sync: Direct integration with Confluence/Notion APIs.
See CONTRIBUTING.md to join the swarm.