Examples - themanojdesai/python-a2a GitHub Wiki

Examples

Python A2A includes a comprehensive set of examples that demonstrate different aspects of the library. They progress from simple to complex, helping you understand the library's capabilities step by step.

Getting Started Examples

These examples provide a gentle introduction to the A2A protocol:

Hello A2A

The simplest example of creating and using an A2A agent:

# examples/getting_started/hello_a2a.py
from python_a2a import A2AServer, HTTPClient, run_server

# Create a simple A2A server
class HelloAgent(A2AServer):
    def handle_task(self, task):
        return {"output": f"Hello, {task.input}!"}

# Serve the agent (this call blocks; run it in one terminal)
agent = HelloAgent()
run_server(agent, port=5000)

# In another terminal or process, send a message
client = HTTPClient("http://localhost:5000")
response = client.send_message("World")
print(response.content)  # "Hello, World!"

Simple Client and Server

A more complete example showing client-server interaction:

# examples/getting_started/simple_server.py
from python_a2a import A2AServer, agent, skill, run_server

@agent(
    name="Echo Agent",
    description="Simple agent that echoes messages"
)
class EchoAgent(A2AServer):
    
    @skill(
        name="Echo Message",
        description="Echo back the input message"
    )
    def echo(self, message):
        return f"You said: {message}"
    
    def handle_task(self, task):
        return {"output": self.echo(task.input)}

# Run the server
agent = EchoAgent()
run_server(agent, port=5000)

# examples/getting_started/simple_client.py
from python_a2a import HTTPClient

client = HTTPClient("http://localhost:5000")
response = client.send_message("Hello, Agent!")
print(response.content)  # "You said: Hello, Agent!"

Function Calling

Example showing how to use function calling with A2A agents:

# examples/getting_started/function_calling.py
from python_a2a import A2AServer, agent, skill, run_server, HTTPClient

@agent(
    name="Math Agent",
    description="Performs math operations"
)
class MathAgent(A2AServer):
    
    @skill(
        name="Add",
        description="Add two numbers",
        parameters={
            "a": {"type": "number", "description": "First number"},
            "b": {"type": "number", "description": "Second number"}
        }
    )
    def add(self, a, b):
        return a + b
    
    @skill(
        name="Multiply",
        description="Multiply two numbers",
        parameters={
            "a": {"type": "number", "description": "First number"},
            "b": {"type": "number", "description": "Second number"}
        }
    )
    def multiply(self, a, b):
        return a * b
    
    def handle_task(self, task):
        # Parse function call
        function_call = task.function_call
        if function_call and function_call.name == "add":
            args = function_call.arguments
            result = self.add(args["a"], args["b"])
            return {"output": result}
        elif function_call and function_call.name == "multiply":
            args = function_call.arguments
            result = self.multiply(args["a"], args["b"])
            return {"output": result}
        else:
            return {"output": "Please call a function like add or multiply"}

# Example usage: first serve the agent, e.g. run_server(MathAgent(), port=5000),
# then connect from a separate process
client = HTTPClient("http://localhost:5000")

# Call the add function
result = client.call_function("add", {"a": 5, "b": 3})
print(result)  # 8

# Call the multiply function
result = client.call_function("multiply", {"a": 5, "b": 3})
print(result)  # 15

MCP Integration Examples

These examples demonstrate the Model Context Protocol integration:

MCP Agent

Creating an agent that uses MCP tools:

# examples/mcp/mcp_agent.py
from python_a2a.mcp import FastMCP, MCPAgent
from python_a2a import run_server

# Create an MCP server with tools
calculator_mcp = FastMCP(name="Calculator MCP")

@calculator_mcp.tool()
def add(a: float, b: float) -> float:
    """Add two numbers together."""
    return a + b

@calculator_mcp.tool()
def subtract(a: float, b: float) -> float:
    """Subtract b from a."""
    return a - b

# Create an agent that uses the MCP tools
agent = MCPAgent(
    name="Math Assistant",
    description="Assistant that can perform calculations",
    mcp_server_url="http://localhost:8000"
)

# Run both servers
if __name__ == "__main__":
    # Run MCP server in a separate thread
    import threading
    threading.Thread(
        target=run_server,
        args=(calculator_mcp,),
        kwargs={"port": 8000},
        daemon=True
    ).start()
    
    # Run the agent
    run_server(agent, port=5000)
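
The run-both-servers pattern above (one server on a daemon thread, the other on the main thread) can be exercised without python_a2a at all. Here a minimal stdlib HTTP server stands in for the MCP server; the port and handler are illustrative:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stub server standing in for the MCP server from the example above
class Ping(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):
        pass  # silence per-request logging

server = HTTPServer(("127.0.0.1", 8800), Ping)
# daemon=True mirrors the example: the thread dies with the main process
threading.Thread(target=server.serve_forever, daemon=True).start()

body = urllib.request.urlopen("http://127.0.0.1:8800").read().decode()
print(body)  # ok
server.shutdown()
```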

MCP Tools

Example of creating and using custom MCP tools:

# examples/mcp/mcp_tools.py
from python_a2a.mcp import FastMCP
from python_a2a import run_server

# Create an MCP server with a web search tool
web_tools = FastMCP(name="Web Tools")

@web_tools.tool()
def search_web(query: str) -> list:
    """Search the web for information."""
    # Simulated web search
    return [
        {"title": f"Result for {query} 1", "snippet": f"Information about {query}..."},
        {"title": f"Result for {query} 2", "snippet": f"More information about {query}..."}
    ]

@web_tools.tool()
def get_weather(location: str) -> dict:
    """Get weather information for a location."""
    # Simulated weather API
    return {
        "location": location,
        "temperature": 72,
        "condition": "Sunny",
        "humidity": 45
    }

# Serve the tools
if __name__ == "__main__":
    # Run the MCP server
    run_server(web_tools, port=8000)
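
At heart, FastMCP's `@tool()` decorator is a registry pattern: each function is recorded under its name, with its docstring available as the tool description. A dependency-free sketch of that pattern (an illustration, not the library's actual internals):

```python
# Minimal tool registry mirroring the decorator pattern FastMCP exposes.
# This is an illustrative sketch, not python_a2a's implementation.
tools = {}

def tool():
    def register(fn):
        tools[fn.__name__] = fn  # record under the function's name
        return fn
    return register

@tool()
def get_weather(location: str) -> dict:
    """Get weather information for a location."""
    return {"location": location, "temperature": 72, "condition": "Sunny"}

print(tools["get_weather"]("Paris")["condition"])  # Sunny
```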

LangChain Integration Examples

These examples show the bidirectional integration with LangChain:

LangChain to A2A

Converting LangChain components to A2A:

# examples/langchain/langchain_to_a2a.py
from langchain.llms import OpenAI
from langchain.agents import initialize_agent, Tool
from langchain.chains import LLMMathChain

from python_a2a.langchain import to_a2a_server
from python_a2a import run_server, HTTPClient

# Create a LangChain agent
llm = OpenAI(temperature=0)
math_chain = LLMMathChain(llm=llm)

tools = [
    Tool(
        name="Calculator",
        func=math_chain.run,
        description="Useful for calculations"
    )
]

agent = initialize_agent(tools, llm, agent="zero-shot-react-description", verbose=True)

# Convert to A2A server
a2a_server = to_a2a_server(agent)

# Run the server
if __name__ == "__main__":
    run_server(a2a_server, port=5000)
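
Conceptually, `to_a2a_server` wraps the LangChain agent's invocation behind a `handle_task` interface. A dependency-free sketch of that adapter idea (class and field names are illustrative, not the library's):

```python
# Adapter sketch: expose any callable through a handle_task interface,
# the way to_a2a_server conceptually wraps a LangChain agent.
class CallableServer:
    def __init__(self, fn):
        self.fn = fn

    def handle_task(self, task):
        # Delegate the task input to the wrapped callable
        return {"output": self.fn(task["input"])}

server = CallableServer(lambda q: f"answered: {q}")
print(server.handle_task({"input": "2 + 2"}))  # {'output': 'answered: 2 + 2'}
```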

A2A to LangChain

Using A2A agents in LangChain:

# examples/langchain/a2a_to_langchain.py
from python_a2a import A2AServer, agent, skill
from python_a2a.langchain import to_langchain_agent

from langchain.agents import initialize_agent, Tool
from langchain.llms import OpenAI

# Create a simple A2A agent
@agent(
    name="Weather Agent",
    description="Provides weather information"
)
class WeatherAgent(A2AServer):
    
    @skill(
        name="Get Weather",
        description="Get current weather for a location"
    )
    def get_weather(self, location):
        return f"It's sunny and 75°F in {location}"
    
    def handle_task(self, task):
        return {"output": self.get_weather(task.input)}

# Convert to LangChain
weather_agent = WeatherAgent()
langchain_weather = to_langchain_agent(weather_agent)

# Use in LangChain
llm = OpenAI(temperature=0)
tools = [
    Tool(
        name="Weather",
        func=langchain_weather,
        description="Get weather information for a location"
    )
]

agent = initialize_agent(tools, llm, agent="zero-shot-react-description", verbose=True)

# Run the agent
if __name__ == "__main__":
    result = agent.run("What's the weather like in New York?")
    print(result)
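
The reverse direction is simpler still: `to_langchain_agent` yields a callable, and a LangChain `Tool` is essentially a named callable plus a description. A dependency-free sketch of that shape (`make_tool` is a hypothetical stand-in, not the LangChain API):

```python
# Stand-in for langchain.agents.Tool: a named callable with a description.
# make_tool is hypothetical; it only illustrates the shape Tool captures.
def make_tool(name, func, description):
    return {"name": name, "func": func, "description": description}

def weather_lookup(location):
    return f"It's sunny and 75°F in {location}"

tool = make_tool("Weather", weather_lookup, "Get weather for a location")
print(tool["func"]("New York"))  # It's sunny and 75°F in New York
```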

Streaming Examples

These examples demonstrate real-time response streaming:

Basic Streaming

Simple example of streaming responses:

# examples/streaming/basic_streaming.py
import asyncio
from python_a2a import A2AServer, agent, run_server, HTTPClient

@agent(
    name="Streaming Agent",
    description="Agent that streams responses"
)
class StreamingAgent(A2AServer):
    async def stream_response(self, message):
        words = message.split()
        for word in words:
            yield {"content": word + " "}
            await asyncio.sleep(0.2)  # Simulate thinking time

async def client_example():
    client = HTTPClient("http://localhost:5000")
    
    print("Streaming response:")
    async for chunk in client.stream_response("This is a streaming response example"):
        print(chunk.content, end="", flush=True)
    print("\nStreaming complete!")

if __name__ == "__main__":
    # Run server in a separate thread
    import threading
    agent = StreamingAgent()
    threading.Thread(
        target=run_server,
        args=(agent,),
        kwargs={"port": 5000},
        daemon=True
    ).start()
    
    # Wait for server to start
    import time
    time.sleep(1)
    
    # Run client
    asyncio.run(client_example())
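
The server/client pair above reduces to one core Python idiom: an async generator produced on one side and consumed with `async for` on the other. A dependency-free sketch:

```python
import asyncio

# Server side: an async generator that yields chunks with simulated delay
async def stream_words(message):
    for word in message.split():
        yield word + " "
        await asyncio.sleep(0.01)

# Client side: consume the stream with `async for` and reassemble it
async def collect(message):
    chunks = []
    async for chunk in stream_words(message):
        chunks.append(chunk)
    return "".join(chunks).strip()

result = asyncio.run(collect("This is a streaming response example"))
print(result)  # This is a streaming response example
```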

Agent Networks Examples

These examples show how to build agent networks:

Agent Network

Building a network of specialized agents:

# examples/agent_network/agent_network.py
from python_a2a import AgentNetwork, HTTPClient

# Create an agent network
network = AgentNetwork(name="Travel Assistant Network")

# Add agents to the network
network.add("weather", "http://localhost:5001")
network.add("hotels", "http://localhost:5002")
network.add("flights", "http://localhost:5003")

# Create a client to the network
client = HTTPClient(network)

# Send a message to a specific agent
weather_response = client.send_message_to("weather", "New York")
print(f"Weather: {weather_response.content}")

# Send a message and let the network route it
response = client.send_message("Find me a hotel in New York")
print(f"Hotel: {response.content}")
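
Under the hood, an agent network is a name-to-endpoint registry plus routed sends. A dependency-free sketch (the `transport` callable stands in for the HTTP request the real client makes):

```python
# Registry sketch of the AgentNetwork idea: names map to endpoints,
# and a routed send looks the target up before calling its transport.
class AgentRegistry:
    def __init__(self, name):
        self.name = name
        self.agents = {}

    def add(self, name, url):
        self.agents[name] = url

    def send_to(self, name, message, transport):
        # transport stands in for the HTTP call the real client performs
        return transport(self.agents[name], message)

registry = AgentRegistry("Travel Assistant Network")
registry.add("weather", "http://localhost:5001")
reply = registry.send_to("weather", "New York",
                         lambda url, msg: f"{url} got: {msg}")
print(reply)  # http://localhost:5001 got: New York
```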

Workflows Examples

These examples demonstrate the workflow engine:

Basic Workflow

Creating a simple agent workflow:

# examples/workflows/basic_workflow.py
from python_a2a import AgentNetwork, Flow

# Create an agent network
network = AgentNetwork(name="Research Network")
network.add("research", "http://localhost:5001")
network.add("summarizer", "http://localhost:5002")
network.add("fact_checker", "http://localhost:5003")

# Create a workflow
flow = Flow(agent_network=network)

# Define the workflow steps
flow.ask("research", "Research quantum computing applications")
flow.ask("summarizer", "Summarize the research findings")
flow.ask("fact_checker", "Verify the accuracy of the summary")

# Execute the workflow
result = flow.execute(variables={"topic": "quantum computing"})
print(result.final_output)
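
A Flow like the one above is, at its core, a sequential pipeline in which each agent's reply becomes context for the next prompt. A dependency-free sketch with stub agents standing in for the real A2A servers:

```python
# Sequential pipeline sketch of the Flow above: each step is an
# (agent_name, prompt) pair, and every reply feeds the next step.
def execute(steps, agents, topic):
    output = topic
    for name, prompt in steps:
        output = agents[name](f"{prompt}: {output}")
    return output

# Stub agents that just tag their input, in place of real A2A servers
agents = {
    "research": lambda p: f"findings({p})",
    "summarizer": lambda p: f"summary({p})",
    "fact_checker": lambda p: f"verified({p})",
}
steps = [
    ("research", "Research quantum computing applications"),
    ("summarizer", "Summarize the research findings"),
    ("fact_checker", "Verify the accuracy of the summary"),
]
print(execute(steps, agents, "quantum computing"))
```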

Running the Examples

To run any of these examples:

  1. Navigate to the examples directory
  2. Run the example with Python

# Example:
cd examples/getting_started
python hello_a2a.py

Many examples require running multiple scripts (server and client). For these, you'll need to use separate terminal windows or run the server in the background.
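
One portable way to handle the server-plus-client case from a single terminal is to background the server and clean it up afterwards. A sketch of the pattern (`python3 -m http.server` stands in here for the example's server script):

```shell
# Background a stand-in server, give it a moment to bind, then clean up.
# Replace the http.server line with e.g. `python simple_server.py`.
python3 -m http.server 5050 --bind 127.0.0.1 &
SERVER_PID=$!
sleep 1
# ...run the client here, e.g. `python simple_client.py`...
kill "$SERVER_PID"
echo "done"
```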

For more complex examples, see the full examples directory in the repository.