
Building a 100% local MCP Client

...with complete code walkthrough and explanation.

An MCP client is a component within an AI application (like Cursor) that establishes standardized connections to external tools and data sources via the Model Context Protocol (MCP).

Today, we'll show you how to build one 100% locally.

Tech stack:

  • LlamaIndex to build the MCP-powered Agent.

  • Ollama to locally serve Deepseek-R1.

  • LightningAI for development and hosting.

Here's our workflow:

  • The user submits a query.

  • Agent connects to the MCP server to discover tools.

  • Based on the query, the agent invokes the right tool and retrieves the relevant context.

  • Agent returns a context-aware response.

The code is available in this Studio: Build a 100% local MCP Client. You can run it without any installations by reproducing our environment there.

Let’s implement this!


Build an SQLite MCP Server

For this demo, we've built a simple SQLite server with two tools:

  • add data

  • fetch data

This is done to keep things simple, but the client we're building can connect to any MCP server out there.
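As a reference, here's a minimal sketch of such a server using the official `mcp` Python SDK and its `FastMCP` helper; the table schema and the `demo.db` filename are placeholder choices of ours, not fixed by the client:

```python
# server.py -- minimal SQLite MCP server exposing two tools (sketch)
import sqlite3

from mcp.server.fastmcp import FastMCP

DB_PATH = "demo.db"  # placeholder database file
mcp = FastMCP("sqlite-demo")


def init_db() -> None:
    # Create a simple table to add/fetch data from.
    with sqlite3.connect(DB_PATH) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS people (name TEXT, age INTEGER, profession TEXT)"
        )


@mcp.tool()
def add_data(query: str) -> bool:
    """Run an SQL INSERT statement against the people table."""
    with sqlite3.connect(DB_PATH) as conn:
        conn.execute(query)
        conn.commit()
    return True


@mcp.tool()
def read_data(query: str = "SELECT * FROM people") -> list:
    """Run an SQL SELECT statement and return the matching rows."""
    with sqlite3.connect(DB_PATH) as conn:
        return conn.execute(query).fetchall()


if __name__ == "__main__":
    init_db()
    # Serve over SSE so the client can connect at http://127.0.0.1:8000/sse
    mcp.run(transport="sse")
```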

Set Up LLM

We'll use a locally served Deepseek-R1 via Ollama as the LLM for our MCP-powered agent.
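Here's a sketch of the setup, assuming Ollama is already running locally and a Deepseek-R1 model has been pulled (the exact model tag depends on the variant you downloaded):

```python
from llama_index.core import Settings
from llama_index.llms.ollama import Ollama

# Point LlamaIndex at the locally served Deepseek-R1 model.
# Reasoning models can take a while, so allow a generous timeout.
llm = Ollama(model="deepseek-r1", request_timeout=120.0)
Settings.llm = llm  # use it as the default LLM everywhere below
```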

Define the System Prompt

We define the agent's guiding instructions, telling it to use the available tools before answering user queries.

Feel free to tweak this to fit your requirements.
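Something along these lines works as a starting point (the wording below is just an example, not prescriptive):

```python
SYSTEM_PROMPT = """\
You are an AI assistant for tool calling.

Before you help a user, work with the available tools to interact
with our database, then answer based on what they return.
"""
```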

Define the Agent

We define a function that builds a standard LlamaIndex FunctionAgent, wiring up the LLM, the system prompt, and the tools it can call.

The tools passed to the agent are MCP tools, which LlamaIndex wraps as native tools that can be easily used by our FunctionAgent.
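A sketch of that function, assuming the `llm` and `SYSTEM_PROMPT` defined above and LlamaIndex's `McpToolSpec` wrapper (which we instantiate when initializing the client further down):

```python
from llama_index.core.agent.workflow import FunctionAgent
from llama_index.tools.mcp import McpToolSpec


async def get_agent(tool_spec: McpToolSpec) -> FunctionAgent:
    # Fetch the MCP server's tools and wrap them as native LlamaIndex tools.
    tools = await tool_spec.to_tool_list_async()
    return FunctionAgent(
        name="Agent",
        description="An agent that can work with our local SQLite database.",
        tools=tools,
        llm=llm,
        system_prompt=SYSTEM_PROMPT,
    )
```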

Define Agent Interaction

We pass user messages to our FunctionAgent with a shared Context for memory, stream tool calls, and return its reply.

We manage all the chat history and tool calls here.
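A sketch of that interaction helper; the `ToolCall`/`ToolCallResult` events and the `Context` object come from LlamaIndex's agent workflow machinery:

```python
from llama_index.core.agent.workflow import FunctionAgent, ToolCall, ToolCallResult
from llama_index.core.workflow import Context


async def handle_user_message(
    message_content: str,
    agent: FunctionAgent,
    agent_context: Context,
    verbose: bool = False,
) -> str:
    # Run the agent with a shared Context so chat history persists across turns.
    handler = agent.run(message_content, ctx=agent_context)

    # Stream intermediate events to surface tool calls and their results.
    async for event in handler.stream_events():
        if verbose and isinstance(event, ToolCall):
            print(f"Calling tool {event.tool_name} with args {event.tool_kwargs}")
        elif verbose and isinstance(event, ToolCallResult):
            print(f"Tool {event.tool_name} returned {event.tool_output}")

    response = await handler
    return str(response)
```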

Initialize MCP Client and the Agent

Launch the MCP client, load its tools, and wrap them as native tools for function-calling agents in LlamaIndex.

Then, pass these tools to the agent and create the shared Context that holds its memory.
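A sketch of that initialization, assuming the SQLite MCP server from earlier is serving over SSE on port 8000 (run this inside an async entry point or a notebook cell, since it awaits):

```python
from llama_index.tools.mcp import BasicMCPClient, McpToolSpec

# Connect to the locally running MCP server and discover its tools.
mcp_client = BasicMCPClient("http://127.0.0.1:8000/sse")
mcp_tool_spec = McpToolSpec(client=mcp_client)

# Build the agent around those tools and create its shared memory Context.
agent = await get_agent(mcp_tool_spec)
agent_context = Context(agent)
```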

Run the Agent

Finally, we start interacting with our agent and get access to the tools from our SQLite MCP server.
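A simple chat loop ties everything together (again a sketch; in a standalone script you'd wrap this in an async `main()` and launch it with `asyncio.run`):

```python
while True:
    user_input = input("Enter your message: ")
    if user_input.lower() == "exit":
        break
    response = await handle_user_message(user_input, agent, agent_context, verbose=True)
    print("Agent:", response)
```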

In a sample interaction:

  • When we say something like "Add Rafael Nadal...," the agent understands the intent, generates a corresponding SQL INSERT command, and stores the data in the database.

  • When we say "fetch data," it runs a SELECT query and retrieves the data.

  • It then presents the result back to the user in a readable format.

And there you go, we have built our 100% local MCP client!

The code is available in this Studio: Build a 100% local MCP Client. You can run it without any installations by reproducing our environment there.

Thanks for reading!