🌦 Weather & Resource MCP Agent

Python-based AI agent that uses the Model Context Protocol (MCP) to connect Google Gemini with external tools.

  • Fetches live weather data from OpenWeatherMap
  • Loads and processes a local delivery log resource
  • Built with LangGraph for orchestration and FastMCP for server tooling

📌 Features

  • Weather Tool: Real-time weather for any city via OpenWeatherMap
  • Resource Tool: Load/query a local delivery_log.txt file
  • Interactive Chat: Natural conversation; the agent auto-chooses tools
  • MCP Protocol: Clean separation between AI logic and external tools

🛠 Tech Stack

  • Python 3.10+
  • Google Gemini (gemini-2.0-flash) for reasoning and tool selection
  • LangGraph for conversation flow and memory
  • FastMCP for the MCP server and tooling
  • OpenWeatherMap API for live weather data
  • python-dotenv for environment configuration
📂 Project Structure

.
├── weather_client.py       # AI agent (LangGraph + Gemini + MCP client)
├── weather_server.py       # MCP server (weather + resource tools)
├── delivery_log.txt        # Example resource file
├── requirements.txt        # Python dependencies
├── example.env             # Sample environment configuration
├── .gitignore              # Git ignore rules
├── LICENSE                 # Project license (MIT)
└── README.md               # Project documentation

🔑 Prerequisites

  1. Python 3.10+ installed
  2. API keys for OpenWeatherMap and Google Gemini

⚙️ Setup

  1. Clone the repository
git clone https://github.com/ArslanJajja1/mcp-weather-resource-agent.git
cd mcp-weather-resource-agent
  2. Create and activate a virtual environment
python -m venv venv
  • Windows (PowerShell):
venv\Scripts\Activate.ps1
  • Windows (CMD):
venv\Scripts\activate.bat
  • macOS/Linux:
source venv/bin/activate
  3. Install dependencies
pip install -r requirements.txt
  4. Configure environment variables

Copy example.env to .env and fill in your keys:

OPENWEATHERMAP_API_KEY=your_openweathermap_api_key_here
GOOGLE_GEMINI_API_KEY=your_google_gemini_api_key_here
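
The .env file is plain KEY=value lines. In the project itself python-dotenv loads these at runtime; the stdlib-only sketch below just illustrates the file format (the parsing function is hypothetical, not part of the codebase):

```python
# Hypothetical sketch: parse .env-style KEY=value lines into a dict.
# The real project uses python-dotenv for this.
def parse_env(text: str) -> dict:
    env = {}
    for line in text.splitlines():
        line = line.strip()
        # Skip blanks, comments, and malformed lines
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

sample = """OPENWEATHERMAP_API_KEY=your_openweathermap_api_key_here
GOOGLE_GEMINI_API_KEY=your_google_gemini_api_key_here"""
print(sorted(parse_env(sample)))
```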

🚀 Run

Run the client (it auto-starts the MCP server over stdio):

python weather_client.py

💬 Usage

  • Ask for weather

    • You: What's the weather in London?
    • AI: The current weather in London is scattered clouds, 18°C, feels like 16°C, humidity 72%, wind speed 3.5 m/s.
  • List available resources

    • You: /resources
    • Lists file://delivery_log with description
  • Load a resource

    • You: /resource file://delivery_log
    • Loads the file and optionally performs your requested action

📜 Commands

  • /resources: List all available resources from the MCP server
  • /resource <uri>: Load a specific resource into the conversation (e.g. /resource file://delivery_log)
  • exit / quit / q: Exit the program

🧩 How It Works

  1. weather_client.py launches weather_server.py via MCP (stdio transport)
  2. The client uses LangGraph to manage conversation flow and memory
  3. Gemini (gemini-2.0-flash) decides whether to answer or call a tool
  4. The MCP server executes tools and returns results
  5. The client formats and returns a final answer
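
The decide-then-call loop in steps 3–5 can be sketched in plain Python. This is illustrative only: in the real client, LangGraph drives the flow and Gemini produces the decision, while the MCP server executes the tool.

```python
# Illustrative sketch (not the real client) of the decide-then-call loop:
# a model "decision" is either a direct answer or a tool call.
def run_turn(decision: dict, tools: dict) -> str:
    if decision["type"] == "answer":
        return decision["text"]          # step 3: model answers directly
    name, args = decision["tool"], decision["args"]
    result = tools[name](**args)         # step 4: tool executes (MCP server)
    return f"Tool {name} returned: {result}"  # step 5: format final answer

# Hypothetical stand-in for the MCP server's tool registry
tools = {"get_weather": lambda location: f"18°C in {location}"}
print(run_turn({"type": "tool_call", "tool": "get_weather",
                "args": {"location": "London"}}, tools))
```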

🧪 Tools Exposed by the MCP Server

  • get_weather(location: str) — Returns current weather for a location using OpenWeatherMap
  • Resource file://delivery_log — Returns the contents of delivery_log.txt
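
As a sketch of what get_weather might do behind the scenes: OpenWeatherMap's current-weather endpoint takes the city, API key, and units as query parameters. The helper name below is hypothetical; the real implementation lives in weather_server.py.

```python
# Hypothetical sketch of the request URL the get_weather tool might build.
# Endpoint is OpenWeatherMap's standard current-weather API.
from urllib.parse import urlencode

def build_weather_url(location: str, api_key: str) -> str:
    query = urlencode({"q": location, "appid": api_key, "units": "metric"})
    return f"https://api.openweathermap.org/data/2.5/weather?{query}"

print(build_weather_url("London", "demo_key"))
```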

🔒 Security & Secrets

  • Do not commit your .env file; .gitignore excludes it
  • Keys are read via python-dotenv at runtime

🧰 Development

  • Add more MCP tools in weather_server.py using @mcp.tool() or @mcp.resource()
  • Keep dependencies in requirements.txt
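
The registration pattern behind @mcp.tool() can be illustrated with a tiny hypothetical registry; the real decorator is provided by FastMCP and also handles schemas and transport.

```python
# Hypothetical mini-registry mimicking the @mcp.tool() decorator pattern.
# FastMCP's real decorator does much more (schemas, MCP protocol wiring).
class ToolRegistry:
    def __init__(self):
        self.tools = {}

    def tool(self):
        def register(fn):
            self.tools[fn.__name__] = fn  # record the function by name
            return fn
        return register

mcp = ToolRegistry()

@mcp.tool()
def get_weather(location: str) -> str:
    # In weather_server.py this would call the OpenWeatherMap API.
    return f"Weather for {location}"

print(sorted(mcp.tools))
```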

📄 License

This project is licensed under the MIT License. See LICENSE for details.
