AI-Parrot is a powerful, async-first Python framework for building, extending, and orchestrating AI Agents and Chatbots. Built on top of navigator-api, it provides a unified interface for interacting with various LLM providers, managing tools, conducting agent-to-agent (A2A) communication, and serving agents via the Model Context Protocol (MCP).
Whether you need a simple chatbot, a complex multi-agent orchestration workflow, or a robust production-ready AI service, AI-Parrot exposes the primitives to build it efficiently.
- Unified Agent API: A simple interface (`Chatbot`) to create agents with memory, tools, and RAG capabilities.
- Tool Management: Easy-to-use decorators (`@tool`) and class-based toolkits (`AbstractToolkit`) to give your agents capabilities.
- Orchestration & Workflow: `AgentCrew` for managing multi-agent workflows (Sequential, Parallel, Flow, Loop).
- Advanced Connectivity:
  - A2A (Agent-to-Agent): Native protocol for agents to discover and talk to each other.
  - MCP (Model Context Protocol): Expose your agents as MCP servers or consume external MCP servers.
- OpenAPI Integration: Consume any OpenAPI specification as a dynamic toolkit (`OpenAPIToolkit`).
- Scheduling: Built-in task scheduling for agents using the `@schedule` decorator.
- Multi-Provider Support: Switch seamlessly between OpenAI, Anthropic, Google Gemini, Groq, and more.
- Integrations: Native support for exposing bots via Telegram, MS Teams, and Slack.
```bash
pip install ai-parrot
```

For specific provider support (e.g., Anthropic, Google):

```bash
pip install "ai-parrot[anthropic,google]"
```

Create a simple weather chatbot in just a few lines of code:
```python
import asyncio
from parrot.bots import Chatbot
from parrot.tools import tool

# 1. Define a tool
@tool
def get_weather(location: str) -> str:
    """Get the current weather for a location."""
    return f"The weather in {location} is sunny, 25°C"

async def main():
    # 2. Create the Agent
    bot = Chatbot(
        name="WeatherBot",
        llm="openai:gpt-4o",  # Provider:Model
        tools=[get_weather],
        system_prompt="You are a helpful weather assistant."
    )
    # 3. Configure (loads tools, connects to memory)
    await bot.configure()
    # 4. Chat!
    response = await bot.ask("What's the weather like in Madrid?")
    print(response)

if __name__ == "__main__":
    asyncio.run(main())
```

AI-Parrot is designed with a modular architecture enabling agents to be both consumers and providers of tools and services.
```mermaid
graph TD
    User["User / Client"] --> API["AgentTalk Handlers"]
    API --> Bot["Chatbot / BaseBot"]

    subgraph "Agent Core"
        Bot --> Memory["Memory / Vector Store"]
        Bot --> LLM["LLM Client (OpenAI/Anthropic/Etc)"]
        Bot --> TM["Tool Manager"]
    end

    subgraph "Tools & Capabilities"
        TM --> LocalTools["Local Tools (@tool)"]
        TM --> Toolkits["Toolkits (OpenAPI/Custom)"]
        TM --> MCPServer["External MCP Servers"]
    end

    subgraph "Connectivity"
        Bot -.-> A2A["A2A Protocol (Client/Server)"]
        Bot -.-> MCP["MCP Protocol (Server)"]
        Bot -.-> Integrations["Telegram / MS Teams"]
    end

    subgraph "Orchestration"
        Crew["AgentCrew"] --> Bot
        Crew --> OtherBots["Other Agents"]
    end
```
The `Chatbot` class is your main entry point. It handles conversation history, RAG (Retrieval-Augmented Generation), and the tool-execution loop.
```python
bot = Chatbot(
    name="MyAgent",
    model="anthropic:claude-3-5-sonnet-20240620",
    enable_memory=True
)
```

The `@tool` decorator is the simplest way to create a tool. The docstring and type hints are automatically used to generate the schema for the LLM.
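Under the hood, a schema of this kind can be derived from the function signature alone using standard introspection. The following is a minimal plain-Python sketch of the idea, with a hypothetical `tool_schema` helper; it is not Parrot's actual implementation:

```python
import inspect
import typing

# Illustrative mapping from Python annotations to JSON-Schema type names
PY_TO_JSON = {str: "string", int: "integer", float: "number", bool: "boolean"}

def tool_schema(func):
    """Build a minimal JSON-Schema-style tool description from a
    function's signature and docstring (hypothetical helper)."""
    sig = inspect.signature(func)
    hints = typing.get_type_hints(func)
    properties, required = {}, []
    for name, param in sig.parameters.items():
        properties[name] = {"type": PY_TO_JSON.get(hints.get(name), "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)  # no default value -> required argument
    return {
        "name": func.__name__,
        "description": inspect.getdoc(func),
        "parameters": {
            "type": "object",
            "properties": properties,
            "required": required,
        },
    }

def calculate_vat(amount: float, rate: float = 0.20) -> float:
    """Calculate VAT for a given amount."""
    return amount * rate

schema = tool_schema(calculate_vat)
# "amount" is required; "rate" is optional because it has a default
```

This is why docstrings and precise type hints matter: they are the only description the LLM sees when deciding whether and how to call the tool.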
```python
from parrot.tools import tool

@tool
def calculate_vat(amount: float, rate: float = 0.20) -> float:
    """Calculate VAT for a given amount."""
    return amount * rate
```

Group related tools into a reusable class. All public async methods become tools.
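The discovery step behind that rule can be illustrated in plain Python with `inspect`; this is a sketch of the general mechanism (using a stand-in `DemoToolkit` class), not Parrot's actual code:

```python
import inspect

class DemoToolkit:
    """Stand-in for an AbstractToolkit subclass."""

    async def add(self, a: int, b: int) -> int:
        """Add two numbers."""
        return a + b

    async def multiply(self, a: int, b: int) -> int:
        """Multiply two numbers."""
        return a * b

    async def _helper(self):
        """Private: leading underscore, so not exposed as a tool."""

def discover_tools(toolkit) -> list:
    # Public coroutine methods become tools; _-prefixed ones are skipped
    return [
        name
        for name, member in inspect.getmembers(toolkit, inspect.iscoroutinefunction)
        if not name.startswith("_")
    ]

tools = discover_tools(DemoToolkit())  # ['add', 'multiply']
```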
```python
from parrot.tools import AbstractToolkit

class MathToolkit(AbstractToolkit):
    async def add(self, a: int, b: int) -> int:
        """Add two numbers."""
        return a + b

    async def multiply(self, a: int, b: int) -> int:
        """Multiply two numbers."""
        return a * b
```

Dynamically generate tools from any OpenAPI/Swagger specification.
```python
from parrot.tools.openapi_toolkit import OpenAPIToolkit

petstore = OpenAPIToolkit(
    spec="https://petstore.swagger.io/v2/swagger.json",
    service="petstore"
)

# Now your agent can call petstore_get_pet_by_id, etc.
bot = Chatbot(name="PetBot", tools=petstore.get_tools())
```

Orchestrate multiple agents to solve complex tasks using `AgentCrew`.
Supported Modes:
- Sequential: Agents run one after another, passing context.
- Parallel: Independent tasks run concurrently.
- Flow: DAG-based execution defined by dependencies.
- Loop: Iterative execution until a condition is met.
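To make the Flow mode concrete, the dependency resolution it implies can be sketched with the standard library's `graphlib` (plain Python, not the `AgentCrew` API; the agent names are hypothetical):

```python
from graphlib import TopologicalSorter

# Each agent maps to the set of agents it waits for
flow = {
    "researcher": set(),       # no dependencies: runs first
    "writer": {"researcher"},  # waits for researcher
    "reviewer": {"writer"},    # waits for writer
}

# Resolve the DAG into a valid execution order
order = list(TopologicalSorter(flow).static_order())
# -> ['researcher', 'writer', 'reviewer']
```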
```python
from parrot.bots.orchestration import AgentCrew

crew = AgentCrew(
    name="ResearchTeam",
    agents=[researcher_agent, writer_agent]
)

# Define a Flow:
# Writer waits for Researcher to finish
crew.task_flow(researcher_agent, writer_agent)
await crew.run_flow("Research the latest advancements in Quantum Computing")
```

Give your agents agency to run tasks in the background.
```python
from parrot.scheduler import schedule, ScheduleType

class DailyBot(Chatbot):
    @schedule(schedule_type=ScheduleType.DAILY, hour=9, minute=0)
    async def morning_briefing(self):
        news = await self.ask("Summarize today's top tech news")
        await self.send_notification(news)
```

Agents can discover and talk to each other using the A2A protocol.
Expose an Agent:
```python
# In your server setup (aiohttp)
from parrot.a2a import A2AServer

a2a = A2AServer(my_agent)
a2a.setup(app, url="https://my-agent.com")
```

Consume an Agent:
```python
from parrot.a2a import A2AClient

async with A2AClient("https://remote-agent.com") as client:
    response = await client.send_message("Hello from another agent!")
```

AI-Parrot has first-class support for MCP.
Consume MCP Servers: Give your agent access to filesystem, git, or any other MCP server.
```python
# In Chatbot config
mcp_servers = [
    MCPServerConfig(
        name="filesystem",
        command="npx",
        args=["-y", "@modelcontextprotocol/server-filesystem", "/home/user"]
    )
]
await bot.setup_mcp_servers(mcp_servers)
```

Expose Agent as MCP Server: Allow Claude Desktop or other MCP clients to use your agent as a tool. (Configuration details in the documentation.)

Expose your bots natively to chat platforms defined in your `parrot.conf`:
- Telegram
- Microsoft Teams
- Slack
AI-Parrot supports a wide range of LLM providers via `parrot.clients`:
- OpenAI (`openai`)
- Anthropic (`anthropic`, `claude`)
- Google Gemini (`google`)
- Groq (`groq`)
- X.AI (`grok`)
- HuggingFace (`hf`)
- Ollama/Local (via an OpenAI-compatible endpoint)
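Providers are selected with the `provider:model` string convention shown in the quick start (e.g. `"openai:gpt-4o"`). A hypothetical helper illustrating how such a spec splits (not Parrot's actual parser):

```python
def parse_llm_spec(spec: str):
    """Split a 'provider:model' spec string (illustrative helper)."""
    provider, sep, model = spec.partition(":")
    if not sep or not model:
        raise ValueError(f"expected 'provider:model', got {spec!r}")
    return provider, model

parse_llm_spec("anthropic:claude-3-5-sonnet-20240620")
# -> ('anthropic', 'claude-3-5-sonnet-20240620')
```

Because the model half is an opaque string passed to the provider, switching providers is a one-line change to the spec.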
- Issues: GitHub Tracker
- Discussion: GitHub Discussions
- Contribution: Pull requests are welcome! Please read `CONTRIBUTING.md`.
Built with ❤️ by the Navigator Team