Your First MCP Client
OpenAI as MCP Orchestrator
In the previous example, we built the orchestration logic (the tool-calling loop) inside our MCP client. That works well, but there's another way to do it: we can use OpenAI itself as the orchestrator.
Instead of having the client manage the conversation and tool calls, we can give OpenAI the power to decide when to call tools and when to ask for more information. This lets us write a much simpler client that just forwards messages between OpenAI and the MCP server.
Since this orchestration runs on the OpenAI side, the MCP server must be reachable from the public internet. There are a few ways to achieve that:
We can do it at the server code level with mcp_server = FastMCP("CalculatorServer", host="x.x.x.x"), where x.x.x.x is the public IP address of your machine. In this case, we also need to handle SSL/TLS encryption ourselves if needed.
We can deploy the server behind a domain name with an SSL certificate. This is the most robust solution, but it requires more setup and maintenance.
We can use a tool like ngrok to create a tunnel to our machine without changing the server code. This is especially useful if you're behind a NAT or firewall, and there's no need to set up SSL/TLS since ngrok provides a secure tunnel with HTTPS endpoints.
We'll use ngrok in this example. Install the tool if you haven't already. Then run the following command:
# Run the server
cd $HOME/workspace/mcp-server
uv run calculator_server.py
# In another terminal, run ngrok to expose it to the internet
ngrok http 8000 \
--host-header=localhost:8000
The ngrok command will give you a public URL (something like https://abc123.ngrok-free.app) that tunnels to your local server at http://localhost:8000. You can use this URL in your OpenAI API requests to allow the model to communicate with your MCP server.
The --host-header=localhost:8000 flag tells ngrok to rewrite the Host header before forwarding requests to your local MCP server, so the server sees Host: localhost:8000 instead of the ngrok domain (which it might not recognize).
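With the tunnel up, the public URL is what we'll hand to OpenAI. As a sketch of what that looks like, the helper below builds the tool entry that the OpenAI Responses API expects for a remote MCP server. The URL is a placeholder, "calculator" is a label of our own choosing, and the "/mcp" suffix assumes FastMCP's default streamable-HTTP endpoint path:

```python
import os

# Placeholder URL -- use the HTTPS address ngrok printed for you,
# plus FastMCP's default "/mcp" endpoint path.
os.environ.setdefault("MCP_SERVER_URL", "https://abc123.ngrok-free.app/mcp")

def mcp_tool(server_url: str) -> dict:
    """Build a remote-MCP tool entry for an OpenAI Responses API call."""
    return {
        "type": "mcp",                 # tool type for remote MCP servers
        "server_label": "calculator",  # a name of our choosing
        "server_url": server_url,      # the publicly reachable (ngrok) URL
        "require_approval": "never",   # don't pause for per-call approval
    }

print(mcp_tool(os.environ["MCP_SERVER_URL"]))
```

Passing this dict in the `tools` list of a Responses API request is what delegates the tool-calling loop to OpenAI.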
Now, let's write a new application that instructs OpenAI to call the MCP server directly:
cat <<EOF > $HOME/workspace/mcp-client/simple_client.py
from openai import OpenAI
import os
import sys
openai = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))
ngrok_url = os.getenv("MCP_SERVER_URL")
# Get the query from command-line arguments
if len(sys.argv) < 2:
    print('Usage: python simple_client.py "<your question>"')
    sys.exit(1)
query = sys.argv[1]

# Hand the MCP server to OpenAI as a remote tool; OpenAI runs the
# tool-calling loop on its side and returns the final answer.
response = openai.responses.create(
    model="gpt-5-mini",
    tools=[
        {
            "type": "mcp",
            "server_label": "calculator",
            "server_url": ngrok_url,
            "require_approval": "never",
        }
    ],
    input=query,
)
print(response.output_text)
EOF
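To try the client, point it at the tunnel and pass a question on the command line. This is a usage sketch: the URL is a placeholder for whatever ngrok printed, and the "/mcp" suffix assumes FastMCP's default streamable-HTTP endpoint path.

```shell
cd $HOME/workspace/mcp-client
# Placeholder URL -- substitute the HTTPS address ngrok gave you
export MCP_SERVER_URL=https://abc123.ngrok-free.app/mcp
uv run simple_client.py "What is (3 + 4) * 12?"
```

If everything is wired up, OpenAI calls the calculator tools through the tunnel and the client prints the model's final answer.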
