A sample agent demonstrating A2A + ADK + MCP working together. It leverages the new Agent2Agent (A2A) Python SDK (`a2a-sdk`) and v1.0.0+ of Google's Agent Development Kit (ADK), `google-adk`. Both were announced at Google I/O 2025.
The sample lays out a foundation and showcases the capabilities of A2A + ADK + MCP: a currency agent that can convert between different currencies.
MCP is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications. Just as USB-C provides a standardized way to connect your devices to various peripherals and accessories, MCP provides a standardized way to connect AI models to different data sources and tools. - Anthropic
The MCP server in this example exposes a tool, `get_exchange_rate`, that can be used to get the exchange rate between two currencies such as USD and EUR. It leverages the Frankfurter API to fetch the exchange rate. Our agent uses an MCP client to invoke this tool when needed.
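For orientation, here is a minimal sketch of what such a tool could look like, assuming the FastMCP helper from the `mcp` Python package and the public Frankfurter endpoint; the repo's actual `mcp-server/server.py` may differ in details.

```python
# Illustrative sketch of an MCP server exposing get_exchange_rate (not the exact repo code).
import httpx
from mcp.server.fastmcp import FastMCP

# FastMCP registers tools and serves them over the Model Context Protocol;
# port 8080 matches the port mentioned in this README.
mcp = FastMCP("Currency MCP Server", host="0.0.0.0", port=8080)

@mcp.tool()
def get_exchange_rate(currency_from: str = "USD", currency_to: str = "EUR") -> dict:
    """Return the latest exchange rate between two currencies via the Frankfurter API."""
    response = httpx.get(
        "https://api.frankfurter.app/latest",
        params={"from": currency_from, "to": currency_to},
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    # Serve over SSE so a remote MCP client (our agent) can connect to it.
    mcp.run(transport="sse")
```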
ADK is a flexible and modular framework for developing and deploying AI agents. While optimized for Gemini and the Google ecosystem, ADK is model-agnostic, deployment-agnostic, and is built for compatibility with other frameworks. - ADK
ADK (v1.0.0) is used as the orchestration framework for creating our currency agent in this sample. It handles the conversation with the user and invokes our MCP tool when needed.
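The ADK side can look roughly like the hedged sketch below. It assumes ADK's `MCPToolset` with SSE connection parameters pointing at the MCP server above; the model name, instruction text, and exact class names are illustrative and may differ from this repo and across ADK versions.

```python
# Illustrative sketch of the ADK agent definition (names may vary by ADK version).
from google.adk.agents import LlmAgent
from google.adk.tools.mcp_tool.mcp_toolset import MCPToolset, SseServerParams

root_agent = LlmAgent(
    model="gemini-2.0-flash",  # assumed model name, not taken from the repo
    name="currency_agent",
    instruction="Answer currency questions using the get_exchange_rate tool.",
    tools=[
        # Connect the agent to the MCP server started on port 8080.
        MCPToolset(connection_params=SseServerParams(url="http://localhost:8080/sse")),
    ],
)
```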
Agent2Agent (A2A) protocol addresses a critical challenge in the AI landscape: enabling gen AI agents, built on diverse frameworks by different companies running on separate servers, to communicate and collaborate effectively - as agents, not just as tools. A2A aims to provide a common language for agents, fostering a more interconnected, powerful, and innovative AI ecosystem. - A2A
The new A2A Python SDK is used to create an A2A server that advertises and executes our ADK agent. We then run an A2A client that connects to our A2A server and invokes our ADK agent.
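For a feel of the server wiring, here is a hedged sketch modeled on the a2a-sdk samples; class and field names can vary between SDK versions, and `CurrencyAgentExecutor` is a hypothetical placeholder for the repo's ADK-backed executor.

```python
# Illustrative A2A server sketch based on a2a-sdk samples (not the exact repo code).
import uvicorn
from a2a.server.apps import A2AStarletteApplication
from a2a.server.request_handlers import DefaultRequestHandler
from a2a.server.tasks import InMemoryTaskStore
from a2a.types import AgentCapabilities, AgentCard, AgentSkill

from currency_agent.agent_executor import CurrencyAgentExecutor  # hypothetical module/class

# The agent card is what the A2A server advertises to clients.
skill = AgentSkill(
    id="convert_currency",
    name="Currency Exchange Rates Tool",
    description="Helps with exchange rates between currencies",
    tags=["currency conversion", "currency exchange"],
    examples=["What is the exchange rate between USD and EUR?"],
)
agent_card = AgentCard(
    name="Currency Agent",
    description="Converts between currencies",
    url="http://localhost:10000/",
    version="1.0.0",
    defaultInputModes=["text"],
    defaultOutputModes=["text"],
    capabilities=AgentCapabilities(streaming=True),
    skills=[skill],
)

# DefaultRequestHandler bridges incoming A2A requests to the ADK-backed executor.
request_handler = DefaultRequestHandler(
    agent_executor=CurrencyAgentExecutor(),
    task_store=InMemoryTaskStore(),
)
server = A2AStarletteApplication(agent_card=agent_card, http_handler=request_handler)
uvicorn.run(server.build(), host="0.0.0.0", port=10000)
```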
- Python 3.10+
- Git, for cloning the repository.
- Clone the repository:
git clone https://github.com/jackwotherspoon/currency-agent.git
cd currency-agent
- Install uv (used to manage dependencies):
# macOS and Linux
curl -LsSf https://astral.sh/uv/install.sh | sh
# Windows (uncomment the line below)
# powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
Note: You may need to restart your terminal or open a new one after installing `uv`.
- Configure environment variables (via a `.env` file):
There are two different ways to call Gemini models:
- Calling the Gemini API directly using an API key created via Google AI Studio.
- Calling Gemini models through Vertex AI APIs on Google Cloud.
Tip: An API key from Google AI Studio is the quickest way to get started. Existing Google Cloud users may want to use Vertex AI.
Gemini API Key
Get an API Key from Google AI Studio: https://aistudio.google.com/apikey
Create a `.env` file by running the following (replace `<your_api_key_here>` with your API key):
echo "GOOGLE_API_KEY=<your_api_key_here>" >> .env \
&& echo "GOOGLE_GENAI_USE_VERTEXAI=FALSE" >> .env
Vertex AI
To use Vertex AI, you will need a Google Cloud project. Authenticate and enable the Vertex AI API:
gcloud auth login
# Replace <your_project_id> with your project ID
gcloud config set project <your_project_id>
gcloud services enable aiplatform.googleapis.com
Create a `.env` file by running the following (replace `<your_project_id>` with your project ID):
echo "GOOGLE_GENAI_USE_VERTEXAI=TRUE" >> .env \
&& echo "GOOGLE_CLOUD_PROJECT=<your_project_id>" >> .env \
&& echo "GOOGLE_CLOUD_LOCATION=us-central1" >> .env
Now you are ready for the fun to begin!
In a terminal, start the MCP Server (it starts on port 8080):
uv run mcp-server/server.py
In a separate terminal, start the A2A Server (it starts on port 10000):
uv run currency_agent
In a separate terminal, run the A2A Client to run some queries against our A2A server:
uv run currency_agent/test_client.py
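If you are curious what the client does under the hood, the following is a hedged sketch modeled on the a2a-sdk client samples, not the repo's actual `test_client.py`; type names and the message payload shape may vary between SDK versions.

```python
# Illustrative A2A client sketch (modeled on a2a-sdk samples, not the repo's test_client.py).
import asyncio
from uuid import uuid4

import httpx
from a2a.client import A2ACardResolver, A2AClient
from a2a.types import MessageSendParams, SendMessageRequest


async def main() -> None:
    async with httpx.AsyncClient() as httpx_client:
        # Fetch the agent card advertised by the A2A server on port 10000.
        resolver = A2ACardResolver(httpx_client=httpx_client, base_url="http://localhost:10000")
        agent_card = await resolver.get_agent_card()
        client = A2AClient(httpx_client=httpx_client, agent_card=agent_card)

        # Send a single user message asking for a conversion.
        request = SendMessageRequest(
            id=str(uuid4()),
            params=MessageSendParams(
                message={
                    "role": "user",
                    "parts": [{"kind": "text", "text": "How much is 100 USD in EUR?"}],
                    "messageId": uuid4().hex,
                }
            ),
        )
        response = await client.send_message(request)
        print(response.model_dump_json(indent=2, exclude_none=True))


asyncio.run(main())
```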
Contributions are welcome! Please feel free to submit pull requests or open issues.
This project is licensed under the Apache 2.0 License - see the LICENSE file for details.