Building an Advanced Netflix MCP: Introduction and Setup
FastMCP Netflix Server-Client Architecture
What Are We Building?
Imagine you want to give an AI assistant like ChatGPT the ability to query Netflix viewing data, manage a user's favorite movies, and perform analyses on viewing trends. The AI needs to access a PostgreSQL database containing millions of viewing records across thousands of movies and TV shows.
We'll use PostgreSQL for this project, loaded with a sample dataset of Netflix movies. The database will run in a Docker container on our local machine.
The sample database is based on data from the Netflix Engagement Report and the Netflix Global Top 10 weekly list. It includes movies and TV shows for learning and practice purposes.
The database includes information about 11,960 movies, 8,596 TV show seasons, and 37,385 viewing summary records spanning from June 2021 to December 2024. We're going to focus solely on movies for this project to keep things manageable, but the architecture we build will be flexible enough to support TV shows and other content types in the future.
But here's the challenge: the AI can't directly access databases, and you don't want to expose your database credentials to external services.
This is where the Model Context Protocol comes in. We're building a server that sits between the AI and your database, exposing specific capabilities through a standardized protocol. The server speaks MCP on one side and SQL on the other. A client application connects the AI to this server, translating the AI's needs into MCP requests and presenting the results back in a way the AI can understand.
Our project has three main participants: the server, the client, and the LLM. The server exposes Netflix viewing data through MCP, providing tools like "search movies by title" and "get top movies by viewing hours". The client creates a conversational interface where users can ask natural language questions like "What are the most-watched movies?" It relies on the third participant, OpenAI's GPT models, to understand user intent and decide which server tools to call.
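To make the protocol side concrete: when the LLM decides to call a tool, the client sends the server a JSON-RPC message. A call to a movie-search tool might look roughly like this (the method name and message shape follow the MCP specification; the tool name and argument are illustrative, not the course's actual code):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_movies_by_title",
    "arguments": { "title": "Wednesday" }
  }
}
```

The server runs the matching function against the database and returns the results in a `tools/call` response, which the client hands back to the model.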
Understanding the Project Structure
Let's start by understanding how the code is organized. The project is divided into three main areas, each with a specific purpose.
The server/ directory contains everything that runs on the server side; this is where the MCP protocol implementation lives. Inside, you'll find main.py, which serves as the entry point where we configure and launch the server. The database.py file handles all database connectivity, defining the SQLAlchemy models that represent our tables. Perhaps most importantly, there's a components/ subdirectory that contains the three types of MCP primitives we can expose: tools (executable functions), resources (readable documents), and prompts (analysis templates).
The client/ directory contains the code that runs on the user's machine. The main.py file here creates an interactive REPL (Read-Eval-Print Loop) where users can ask questions in plain English. This client talks to both our MCP server and OpenAI's API, acting as a bridge between human language and structured tool calls. Inside the handlers/ subdirectory, you'll find code that responds to various types of requests the server might make back to the client, such as asking the user to clarify which movie they meant when there are multiple matches.
The client doesn't know anything about databases or SQL. Each component has a clear job, and they communicate through the standardized MCP protocol.
This is the file tree of the project:
├── client
│  ├── handlers
│  │  ├── elicitation.py
│  │  ├── __init__.py
│  │  ├── logging.py
│  │  ├── progress.py
│  │  └── sampling.py
│  ├── main.py
│  └── pyproject.toml
└── server
   ├── components
   │  ├── __init__.py
   │  ├── prompts.py
   │  ├── resources.py
   │  └── tools.py
   ├── database.py
   ├── fastmcp.json
   ├── main.py
   └── pyproject.toml
Every file and directory has a specific purpose. Here's a quick overview of what each one does:
Server:
server/main.py: Configures and runs the MCP server, registers all tools/resources/prompts, sets up database connections and middleware.
The server entry point:
Practical MCP with FastMCP & LangChain
Engineering the Agentic Experience
