Published 6 months ago by Akshay Pachaar

MCP meets Ollama: Build a 100% local MCP client

I just built a 100% local MCP client (one you can connect to any MCP server). We often use Cursor IDE or Claude Desktop as MCP hosts, where the client relies on an external LLM (Claude Sonnet, GPT-4, etc.). While these tools are excellent, there are cases, especially when handling sensitive data, where a fully secure and private MCP client is essential. The MCP client we're building today is powered by a local LLM.

Tech stack:
- LlamaIndex to build the MCP-powered agent
- Ollama to serve DeepSeek-R1 locally
- Lightning AI for development and hosting

Here's an overview of how it works (a minimal code sketch follows at the end of this post):
- The user submits a query.
- The agent connects to the MCP server to discover its tools.
- Based on the query, the agent invokes the right tool and retrieves context.
- The agent returns a context-aware response.

The video gives you a complete step-by-step walkthrough of how to build this, along with code explanations. You can find all the code in this GitHub repo:

I have also published it as a Lightning AI Studio, where everything is set up and ready to run:

#mcp #llm #ai #aiagents #ollama
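Here is a minimal sketch of the flow described above, not the exact code from the repo. It assumes the llama-index-tools-mcp and llama-index-llms-ollama packages are installed, that `ollama pull deepseek-r1` has already been run, and that an MCP server is reachable at the hypothetical URL http://127.0.0.1:8000/sse; the sample query is likewise illustrative.

```python
# Sketch of a fully local MCP client: Ollama serves DeepSeek-R1,
# LlamaIndex discovers MCP tools and drives the agent loop.
# Assumed endpoint/model names; adjust to your own setup.
import asyncio

from llama_index.llms.ollama import Ollama
from llama_index.core.agent.workflow import FunctionAgent
from llama_index.tools.mcp import BasicMCPClient, McpToolSpec


async def main() -> None:
    # Local LLM via Ollama -- no external API calls leave the machine.
    llm = Ollama(model="deepseek-r1", request_timeout=120.0)

    # Connect to the MCP server and discover the tools it exposes.
    mcp_client = BasicMCPClient("http://127.0.0.1:8000/sse")  # assumed URL
    tools = await McpToolSpec(client=mcp_client).to_tool_list_async()

    # Agent that picks the right MCP tool for each incoming query.
    agent = FunctionAgent(
        tools=tools,
        llm=llm,
        system_prompt="Use the available MCP tools to answer the user.",
    )

    # 1) query in, 2) tool call fetches context, 3) grounded answer out.
    response = await agent.run("What does the server know about topic X?")
    print(response)


if __name__ == "__main__":
    asyncio.run(main())
```

One caveat worth noting: a function-calling agent like this assumes the local model handles tool calls; for models without native function calling, LlamaIndex's ReAct-style agent, which drives tool use purely through prompting, is a common drop-in alternative.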