Building a Functional MCP Client
Sampling Handler
The sampling handler is responsible for fulfilling the server's requests for text generation. When the server calls ctx.sample(), it sends a request to the client, which then uses the sampling handler to generate a response using an LLM.
FastMCP provides built-in sampling handlers for popular LLM providers like OpenAI and Anthropic. Since we're using the OpenAI API in our client, we can use the OpenAISamplingHandler that FastMCP provides. This handler takes care of sending the prompt and parameters to OpenAI and returning the generated text back to the server.
Here is how we can set it up in our client:
cat > $HOME/workspace/puppy_guide/client/handlers/sampling.py << 'EOF'
import os
from dotenv import load_dotenv
# The import path and constructor arguments below are assumptions based on
# FastMCP's documentation; verify them against your installed version.
from fastmcp.client.sampling.handlers.openai import OpenAISamplingHandler
from openai import AsyncOpenAI

load_dotenv()

sampling_handler = OpenAISamplingHandler(
    default_model="gpt-4o-mini",  # illustrative model name
    client=AsyncOpenAI(api_key=os.getenv("OPENAI_API_KEY")),
)
EOF
