Integrating Agents with MCP: MCP For LangChain Agents
Integrating An Existing MCP With LangChain
The plan for this section is straightforward: we take the two tools from the previous chapter and move them into a FastMCP server. The agent no longer defines tools itself — it asks MultiServerMCPClient to fetch them from the running server, then passes them to create_agent exactly as before. Everything else — memory, summarization, human-in-the-loop — stays the same.
We will end up with two files:
server.py hosts the tools as MCP tools.
agent.py connects to that server and runs the conversation loop.
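To preview where this is heading, here is a minimal sketch of what agent.py will do: build a connection map, let MultiServerMCPClient fetch the tools, and hand them to create_agent. The model string and the question are placeholders, and the memory, summarization, and human-in-the-loop pieces from the previous chapter are omitted for brevity:

```python
# agent.py (sketch) -- the server is launched as a stdio subprocess.
import asyncio

# Connection map: the key ("weather") is a name of our choosing; the value
# tells the client how to start server.py and which transport to use.
SERVER_CONFIG = {
    "weather": {
        "command": "python",
        "args": ["server.py"],
        "transport": "stdio",
    }
}

async def main() -> None:
    # Imported inside main() so the config above can be inspected
    # without the packages installed.
    from langchain.agents import create_agent
    from langchain_mcp_adapters.client import MultiServerMCPClient

    client = MultiServerMCPClient(SERVER_CONFIG)
    tools = await client.get_tools()  # MCP tools as LangChain BaseTool instances
    agent = create_agent("openai:gpt-4o-mini", tools)  # model name is a placeholder

    result = await agent.ainvoke(
        {"messages": [{"role": "user", "content": "What is the temperature in Berlin?"}]}
    )
    print(result["messages"][-1].content)

# To run: asyncio.run(main())
```

The full agent keeps everything else from the previous chapter; only the source of the tools changes.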
Step 1: Create the Project
mkdir -p $HOME/workspace/langchain/langchain_agent_with_mcp
cd $HOME/workspace/langchain/langchain_agent_with_mcp
uv init --bare --python 3.12
Step 2: Install Dependencies
Two packages are new compared to the previous chapter:
fastmcp is the framework we use to write the MCP server.
langchain-mcp-adapters is the official bridge that converts MCP tool definitions into LangChain BaseTool instances the agent can call.
uv add \
"fastmcp==3.0.2" \
"httpx==0.28.1" \
"langchain==1.2.10" \
"langchain-mcp-adapters==0.2.1" \
"langchain-openai==1.1.10" \
"langgraph==1.0.9" \
"python-dotenv==1.2.1"
Step 3: Add Your API Key
cat > .env <<EOF
OPENAI_API_KEY=your_openai_api_key_here
EOF
Step 4: Write the MCP Server
Create server.py. The tool logic is identical to the previous chapter — the only difference is the decorator. Instead of LangChain's @tool, we use FastMCP's @mcp.tool(). From the agent's perspective the tools look exactly the same; the MCP protocol handles the translation.
Imports and Server Instance
import httpx
from fastmcp import FastMCP
mcp = FastMCP("Weather & Air Quality")
FastMCP("Weather & Air Quality") creates the server and gives it a human-readable name that appears in the MCP handshake.
Coordinate Helper and Tools
The _get_coordinates helper and the two tool functions are unchanged from the previous chapter. The only edit is swapping @tool for @mcp.tool().
def _get_coordinates(location: str) -> tuple[float, float]:
"""Resolve a place name to (latitude, longitude) via the Open-Meteo Geocoding API."""
response = httpx.get(
"https://geocoding-api.open-meteo.com/v1/search",
params={"name": location, "count": 1, "language": "en", "format": "json"},
)
data = response.json()
if "results" in data and len(data["results"]) > 0:
latitude = data["results"][0]["latitude"]
longitude = data["results"][0]["longitude"]
return latitude, longitude
else:
raise ValueError(f"Could not find coordinates for location: {location}")
@mcp.tool()
def get_air_quality(location: str) -> str:
"""Get air quality information based on a location."""
latitude, longitude = _get_coordinates(location)
response = httpx.get(
"https://air-quality-api.open-meteo.com/v1/air-quality",
params={
"latitude": latitude,
"longitude": longitude,
"hourly": "pm10,pm2_5",
"forecast_days": 1,
},
)
data = response.json()
if "hourly" in data and "pm10" in data["hourly"] and "pm2_5" in data["hourly"]:
pm10 = data["hourly"]["pm10"][0]
pm2_5 = data["hourly"]["pm2_5"][0]
        result = f"PM10: {pm10} μg/m³, PM2.5: {pm2_5} μg/m³"
else:
result = "Air quality data not available"
return f"Air quality in {location}: {result}"
@mcp.tool()
def get_temperature(location: str) -> str:
"""Get the current temperature for a location."""
latitude, longitude = _get_coordinates(location)
response = httpx.get(
"https://api.open-meteo.com/v1/forecast",
params={
"latitude": latitude,
"longitude": longitude,
"hourly": "temperature_2m",
"forecast_days": 1,
},
)
data = response.json()
if "hourly" in data and "temperature_2m" in data["hourly"]:
temperature = data["hourly"]["temperature_2m"][0]
result = f"Temperature: {temperature} °C"
else:
result = "Temperature data not available"
return f"Temperature in {location}: {result}"
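To make the guard in get_air_quality concrete, here is the shape of the hourly payload the air-quality endpoint returns, exercised against a canned response (the values are invented for illustration):

```python
# Abridged, made-up payload in the shape the air-quality endpoint returns.
data = {
    "hourly": {
        "time": ["2025-01-01T00:00"],
        "pm10": [12.3],
        "pm2_5": [7.8],
    }
}

# The same guard-and-extract logic used in get_air_quality.
if "hourly" in data and "pm10" in data["hourly"] and "pm2_5" in data["hourly"]:
    result = f"PM10: {data['hourly']['pm10'][0]} μg/m³, PM2.5: {data['hourly']['pm2_5'][0]} μg/m³"
else:
    result = "Air quality data not available"

print(result)  # PM10: 12.3 μg/m³, PM2.5: 7.8 μg/m³
```

Indexing `[0]` takes the first hourly value, i.e. the reading closest to now; a missing or malformed payload falls through to the "not available" message instead of raising.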
Entry Point
We used the HTTP transport in most, if not all, of the previous servers; for variety, let's switch to stdio for this one.
if __name__ == "__main__":
mcp.run(transport="stdio", show_banner=False)
transport="stdio" tells FastMCP to communicate over standard input/output. The agent will launch this script as a subprocess and talk to it over stdio — no network port needed.
show_banner=False suppresses the startup log so the stdio stream stays clean.
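Before wiring up the agent, the server can be exercised in-process. This sketch assumes FastMCP's Client, which accepts a FastMCP instance directly and then uses an in-memory transport, so no subprocess or port is involved:

```python
# smoke_test.py (sketch) -- lists the tools server.py exposes, in-process.
import asyncio

async def list_server_tools() -> list[str]:
    # Imported lazily; assumes fastmcp is installed and server.py is importable.
    from fastmcp import Client
    from server import mcp

    # Passing the FastMCP instance directly selects the in-memory transport,
    # so no stdio subprocess or network port is involved.
    async with Client(mcp) as client:
        tools = await client.list_tools()
        return [tool.name for tool in tools]

# To run: print(asyncio.run(list_server_tools()))
# This should list get_air_quality and get_temperature, since @mcp.tool()
# registers each function under its own name by default.
```

This is handy for checking that both tools registered correctly before involving the agent or the model.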
Complete Server Code
cat > $HOME/workspace/langchain/langchain_agent_with_mcp/server.py <<EOF
# server.py
import httpx
from fastmcp import FastMCP
mcp = FastMCP("Weather & Air Quality")
def _get_coordinates(location: str) -> tuple[float, float]:
"""Resolve a place name to (latitude, longitude) via the Open-Meteo Geocoding API."""
response = httpx.get(
"https://geocoding-api.open-meteo.com/v1/search",
params={"name": location, "count": 1, "language": "en", "format": "json"},
)
data = response.json()
if "results" in data and len(data["results"]) > 0:
latitude = data["results"][0]["latitude"]
longitude = data["results"][0]["longitude"]
return latitude, longitude
else:
raise ValueError(f"Could not find coordinates for location: {location}")
@mcp.tool()
def get_air_quality(location: str) -> str:
"""Get air quality information based on a location."""
latitude, longitude = _get_coordinates(location)
response = httpx.get(
"https://air-quality-api.open-meteo.com/v1/air-quality",
params={
"latitude": latitude,
"longitude": longitude,
"hourly": "pm10,pm2_5",
"forecast_days": 1,
},
)
data = response.json()
if "hourly" in data and "pm10" in data["hourly"] and "pm2_5" in data["hourly"]:
pm10 = data["hourly"]["pm10"][0]
pm2_5 = data["hourly"]["pm2_5"][0]
        result = f"PM10: {pm10} μg/m³, PM2.5: {pm2_5} μg/m³"
else:
result = "Air quality data not available"
return f"Air quality in {location}: {result}"
@mcp.tool()
def get_temperature(location: str) -> str:
"""Get the current temperature for a location."""
latitude, longitude = _get_coordinates(location)
response = httpx.get(
"https://api.open-meteo.com/v1/forecast",
params={
"latitude": latitude,
"longitude": longitude,
            "hourly": "temperature_2m",
            "forecast_days": 1,
        },
    )
    data = response.json()
    if "hourly" in data and "temperature_2m" in data["hourly"]:
        temperature = data["hourly"]["temperature_2m"][0]
        result = f"Temperature: {temperature} °C"
    else:
        result = "Temperature data not available"
    return f"Temperature in {location}: {result}"

if __name__ == "__main__":
    mcp.run(transport="stdio", show_banner=False)
EOF