Python-based AI agent that uses the Model Context Protocol (MCP) to connect Google Gemini with external tools.
- Fetches live weather data from OpenWeatherMap
- Loads and processes a local delivery log resource
- Built with LangGraph for orchestration and FastMCP for server tooling
- Weather Tool: Real-time weather for any city via OpenWeatherMap
- Resource Tool: Load/query a local `delivery_log.txt` file
- Interactive Chat: Natural conversation; the agent auto-chooses tools
- MCP Protocol: Clean separation between AI logic and external tools
- Python 3.10+
- MCP — standard for AI tool integration
- FastMCP — quick MCP server creation
- LangGraph — agent state management
- LangChain Google GenAI — Gemini LLM
- OpenWeatherMap API — weather provider
- python-dotenv — env management
```
.
├── weather_client.py   # AI agent (LangGraph + Gemini + MCP client)
├── weather_server.py   # MCP server (weather + resource tools)
├── delivery_log.txt    # Example resource file
├── requirements.txt    # Python dependencies
├── example.env         # Sample environment configuration
├── .gitignore          # Git ignore rules
├── LICENSE             # Project license (MIT)
└── README.md           # Project documentation
```
- Python 3.10+ installed
- API keys for OpenWeatherMap and Google Gemini
- Clone the repository

  ```bash
  git clone https://github.com/ArslanJajja1/mcp-weather-resource-agent.git
  cd single-server-mcp
  ```

- Create and activate a virtual environment

  ```bash
  python -m venv venv
  ```

  - Windows (PowerShell): `venv\Scripts\Activate.ps1`
  - Windows (CMD): `venv\Scripts\activate.bat`
  - macOS/Linux: `source venv/bin/activate`

- Install dependencies

  ```bash
  pip install -r requirements.txt
  ```

- Configure environment variables

  Copy `example.env` to `.env` and fill in your keys:

  ```
  OPENWEATHERMAP_API_KEY=your_openweathermap_api_key_here
  GOOGLE_GEMINI_API_KEY=your_google_gemini_api_key_here
  ```

- Run the client (it auto-starts the MCP server over stdio):

  ```bash
  python weather_client.py
  ```
- Ask for weather
  - You: What's the weather in London?
  - AI: The current weather in London is scattered clouds, 18°C, feels like 16°C, humidity 72%, wind speed 3.5 m/s.
- List available resources
  - You: `/resources`
  - AI: Lists `file://delivery_log` with its description
- Load a resource
  - You: `/resource file://delivery_log`
  - AI: Loads the file and optionally performs your requested action

Commands:

- `/resources`: List all available resources from the MCP server
- `/resource <uri>`: Load a specific resource into the conversation
- `exit` / `quit` / `q`: Exit the program
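A minimal sketch of how such a command loop might route input (`route` is a hypothetical helper; the client's actual parsing may differ):

```python
def route(user_input: str) -> tuple[str, str]:
    """Classify one line of chat input into (action, argument).
    Hypothetical router mirroring the commands listed above."""
    text = user_input.strip()
    if text.lower() in ("exit", "quit", "q"):
        return ("exit", "")
    if text == "/resources":
        return ("list_resources", "")
    if text.startswith("/resource "):
        # everything after the command name is the resource URI
        return ("load_resource", text.split(maxsplit=1)[1])
    return ("chat", text)  # anything else goes to the LLM
```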
- `weather_client.py` launches `weather_server.py` via MCP (stdio transport)
- The client uses LangGraph to manage conversation flow and memory
- Gemini (`gemini-2.0-flash`) decides whether to answer or call a tool
- The MCP server executes tools and returns results
- The client formats and returns a final answer
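On the wire, MCP messages are JSON-RPC 2.0 sent over the server's stdio pipes. A tool invocation for the weather tool would look roughly like this (illustrative message shape, not the client's actual code):

```python
import json

# JSON-RPC 2.0 request the client sends to invoke a tool; the method
# name "tools/call" and the name/arguments params follow the MCP spec.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "get_weather", "arguments": {"location": "London"}},
}
wire = json.dumps(request)  # one line of JSON written to the server's stdin
```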
- `get_weather(location: str)` — Returns current weather for a location using OpenWeatherMap
- Resource `file://delivery_log` — Returns the contents of `delivery_log.txt`
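A tool like `get_weather` boils down to one call to OpenWeatherMap's current-weather endpoint plus a formatter. A stdlib-only sketch under that assumption (the server's real implementation may differ):

```python
import json
import urllib.parse
import urllib.request

API_URL = "https://api.openweathermap.org/data/2.5/weather"

def build_url(location: str, api_key: str) -> str:
    """Build the OpenWeatherMap current-weather request URL (metric units)."""
    query = urllib.parse.urlencode(
        {"q": location, "appid": api_key, "units": "metric"}
    )
    return f"{API_URL}?{query}"

def format_weather(payload: dict) -> str:
    """Turn an OpenWeatherMap JSON payload into a one-line summary."""
    return (
        f"{payload['weather'][0]['description']}, "
        f"{payload['main']['temp']}°C, feels like {payload['main']['feels_like']}°C, "
        f"humidity {payload['main']['humidity']}%, "
        f"wind speed {payload['wind']['speed']} m/s"
    )

def get_weather(location: str, api_key: str) -> str:
    """Fetch and summarize current weather for a location."""
    with urllib.request.urlopen(build_url(location, api_key)) as resp:
        return format_weather(json.load(resp))
```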
- Do not commit your `.env` file; `.gitignore` excludes it
- Keys are read via `python-dotenv` at runtime
- Add more MCP tools in `weather_server.py` using `@mcp.tool()` or `@mcp.resource()`
- Keep dependencies in `requirements.txt`
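For example, a new tool that summarizes the delivery log could keep its logic in a plain function and register it with the decorator (the `summarize_log` helper below is hypothetical, not part of the repo):

```python
# In weather_server.py the registration would look like:
#
#     @mcp.tool()
#     def summarize_deliveries() -> dict:
#         with open("delivery_log.txt") as fh:
#             return summarize_log(fh.read())
#
# The underlying logic is plain Python and testable on its own:
def summarize_log(text: str) -> dict:
    """Count non-empty entries in a delivery log and return a small summary."""
    entries = [line.strip() for line in text.splitlines() if line.strip()]
    return {"entries": len(entries), "first": entries[0] if entries else ""}
```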
This project is licensed under the MIT License. See LICENSE for details.