Wednesday, December 31, 2025

MCP (Model Context Protocol)

Here is a breakdown of the Model Context Protocol (MCP), an open standard designed to simplify how AI models connect to data and tools.

The "USB-C Moment" for AI

In the past, connecting a mouse or keyboard to a computer required a specific cable and port for every device. Today, we have USB-C: a single port that works for everything.

MCP is the USB-C for AI. It replaces "Glue Code" (manual, messy code written to connect an LLM to a specific API like Yahoo Finance) with a universal standard.


Core Components of MCP

The protocol relies on three main "actors" to work:

  1. MCP Hosts: These are the AI applications (like Claude Desktop or a custom chatbot) that want to use data.

  2. MCP Clients: The software inside the host that maintains the connection to servers.

  3. MCP Servers: Small programs that "expose" specific tools or data (like Google Maps, Todoist, or a local database) to the AI.


3 Key Capabilities (What Servers Offer)

When an MCP client connects to a server, it asks for a list of three things:

  • Tools: Executable functions the AI can trigger (e.g., "Search for hiking trails in Ladakh" or "Book a flight").

  • Resources: Static data or knowledge (e.g., a PDF file, a database record, or a local text file).

  • Prompts: Pre-written templates provided by the server to help app developers and the LLM interact with the data more effectively.


Technical Workflow: How it Works

  1. Discovery: When you start your chatbot, the MCP Client calls list_tools. The Google Maps Server responds with a list of tools and, crucially, detailed descriptions of what they do.

  2. Intelligence: The LLM reads these descriptions. Because LLMs understand language, they can figure out that if a user asks about "hiking in Ladakh," they should use the map_search_places tool.

  3. Extraction: The LLM automatically pulls the required parameters (like Latitude/Longitude) from the user's natural language question.

  4. Execution: The client sends the request to the server. The server acts as a wrapper—it makes the actual API call (like a REST call to Google) and returns the data to the AI in a standardized format.
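
To make steps 1 and 4 concrete, here is a rough sketch of the JSON-RPC messages exchanged under the hood, written as Python dictionaries. The tool name, description, and coordinates are illustrative placeholders (reusing the map_search_places name from above), not the exact payload of any real Google Maps server.

    # Illustrative MCP-style JSON-RPC messages (shapes simplified for readability).

    # 1. Discovery: the client asks the server what tools it offers.
    list_tools_request = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/list",  # the wire-level name behind the list_tools call mentioned above
    }

    # The server answers with tool names, descriptions, and input schemas.
    list_tools_response = {
        "jsonrpc": "2.0",
        "id": 1,
        "result": {
            "tools": [
                {
                    "name": "map_search_places",  # hypothetical tool from the example above
                    "description": "Search for places near a location.",
                    "inputSchema": {              # JSON Schema describing the parameters
                        "type": "object",
                        "properties": {
                            "query": {"type": "string"},
                            "latitude": {"type": "number"},
                            "longitude": {"type": "number"},
                        },
                        "required": ["query"],
                    },
                }
            ]
        },
    }

    # 4. Execution: after the LLM picks the tool and extracts parameters,
    #    the client sends a call request; the server wraps the real REST API.
    call_tool_request = {
        "jsonrpc": "2.0",
        "id": 2,
        "method": "tools/call",
        "params": {
            "name": "map_search_places",
            "arguments": {"query": "hiking trails", "latitude": 34.15, "longitude": 77.57},
        },
    }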


Usage: Why does this matter?

  • No More Maintenance Nightmares: In the old way, if Yahoo Finance changed its API, every developer had to update their "glue code." With MCP, only the MCP Server needs to be updated. All 10,000 developers using that server get the fix automatically.

  • Security: MCP allows AI to interact with local data (files on your laptop) without having to upload those files to the cloud.

  • Standardization: It uses a common schema (JSON Schema definitions of each tool's inputs) so that communication between different AI models and different servers is always predictable.


Summary Table

Feature     | Old Way (Glue Code)                       | New Way (MCP)
Effort      | High; manual coding for every API.        | Low; connect to a pre-built server.
Maintenance | Hard; breaks if the external API changes. | Easy; maintenance is centralized at the server level.
Flexibility | Rigid; hard to swap one tool for another. | High; "plug-and-play" like a USB device.
Integration | Custom Python/TypeScript logic.           | Universal standard (JSON-based protocol).


What Problem MCP Is Solving (In Simple Terms)

AI applications today don’t just use an LLM.
They also need:

  • APIs (Google Maps, Yahoo Finance, Todoist, etc.)

  • Databases

  • Files

  • Prompts

To connect all these, developers write a lot of “glue code” — custom code that:

  • Calls APIs

  • Formats inputs

  • Parses outputs

  • Breaks when APIs change

As more AI apps are built, maintaining this glue code becomes a nightmare.
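
For a feel of what that glue code looks like, here is a deliberately simplified, hypothetical example: a hand-rolled wrapper around a stock-quote REST endpoint (the URL and response fields are made up, not a real Yahoo Finance API). Every app that needs quotes repeats something like this, and it breaks the moment the API changes.

    import requests

    # Hypothetical "glue code": one bespoke wrapper per external API.
    # The URL and JSON fields below are placeholders, not a real endpoint.
    def get_stock_price(ticker: str) -> float:
        url = f"https://api.example-finance.com/v1/quote/{ticker}"
        response = requests.get(url, timeout=10)
        response.raise_for_status()
        data = response.json()
        # Manual parsing: if the provider renames this field, the app silently breaks.
        return float(data["regularMarketPrice"])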

MCP solves this by standardizing how LLMs talk to tools, data, and prompts.


Evolution of AI Applications

Stage 1: Plain LLMs

  • LLM answers questions using training data

  • Cannot fetch live data

Stage 2: Agentic AI

  • LLM + tools (APIs, search, databases)

  • Developers write glue code

  • Examples: CrewAI, LangChain

Stage 3: MCP (Current Shift)

  • Standard protocol for tools & data

  • LLM talks to tools in a uniform way

  • Less glue code, easier maintenance

MCP is the “USB-C moment” for AI
(One standard interface instead of many custom wires)


Simple Real-World Example (Equity Research)

Problem

An equity analyst wants an AI app that:

  • Describes Nvidia & Tesla

  • Pulls latest stock prices

  • Summarizes financial metrics

  • Shows recent news

Without MCP

  • Developer writes custom code for:

    • Yahoo Finance API

    • Web search

    • Internal PDFs

  • Every API change breaks the app

  • Thousands of developers repeat the same work

With MCP

  • Yahoo Finance exposes an MCP server

  • Google Search exposes an MCP server

  • AI app connects once using MCP

  • All tools work in a standard way

  • Faster development

  • Less maintenance

  • Shared ecosystem


What MCP Actually Is (Core Concept)

MCP = A standard way for LLMs to discover and use capabilities

An MCP system has:

  • MCP Client → Your AI app / chatbot (strictly speaking, the client is the connector component inside the host app)

  • MCP Servers → Tools, APIs, databases

At startup, the client asks servers:

  • “What can you do?”


Three Things Every MCP Server Exposes

Tools (Actions)

Examples:

  • Search places

  • Get stock prices

  • Create a task

  • Fetch a webpage

Each tool includes:

  • Description (for LLM understanding)

  • Input schema (parameters)

  • Output format
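
As a sketch of how a tool might be defined, the snippet below uses the FastMCP helper from the MCP Python SDK (decorator names and imports may differ slightly between SDK versions). The function's docstring becomes the description the LLM reads, and the type hints become the input schema. The server name and the tool itself are hypothetical.

    from mcp.server.fastmcp import FastMCP

    # Hypothetical server exposing one tool.
    mcp = FastMCP("todo-demo")

    @mcp.tool()
    def create_task(content: str, due: str = "today") -> str:
        """Create a to-do task with the given content and optional due date."""
        # Placeholder implementation; a real server would call a task API here.
        return f"Created task '{content}' due {due}"

    if __name__ == "__main__":
        mcp.run()  # serves the tool over stdio by default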


Resources (Knowledge)

Examples:

  • Files (PDFs, CSVs)

  • Databases

  • Cloud storage (S3)

Think of resources as readable knowledge sources.
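
A resource can be exposed in a similar way. The sketch below (same hedges as above about SDK details) registers a read-only resource under a made-up URI so that clients can list and read it.

    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("research-notes")

    # Hypothetical resource under a made-up URI; a real server might read this
    # from a local file, a database, or cloud storage instead.
    @mcp.resource("notes://nvidia-summary")
    def nvidia_summary() -> str:
        """A short internal research note about Nvidia."""
        return "Nvidia designs GPUs and AI accelerators; see the full PDF for financials."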


Prompts (Pre-built instructions)

Examples:

  • “Summarize stock performance”

  • “Create financial comparison”

  • “Explain trends for executives”

Prompts are shared so:

  • App developers don’t reinvent them

  • LLMs get consistent instructions
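
Prompts can be registered on the same hypothetical server. In the sketch below, the server ships a reusable summary template so every client phrases the request consistently (again, exact decorator names may vary by SDK version).

    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("research-notes")

    # Hypothetical shared prompt template exposed by the server.
    @mcp.prompt()
    def summarize_stock_performance(ticker: str) -> str:
        """Template for summarizing a stock's recent performance."""
        return (
            f"Summarize the recent performance of {ticker} for an executive audience: "
            "price trend, key financial metrics, and notable news."
        )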


How MCP Works at Runtime (Step-by-Step)

  1. AI app (MCP client) starts

  2. It calls:

    • list_tools

    • list_resources

    • list_prompts

  3. MCP servers respond with descriptions

  4. LLM reads tool descriptions

  5. User asks a question

  6. LLM:

    • Picks the right tool

    • Extracts parameters automatically

    • Calls the tool

    • Reads response

    • Answers user

No hard-coded logic needed
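
The client side of this loop can be sketched with the MCP Python SDK as below (exact module paths and method names may vary by SDK version; the server command and tool arguments are placeholders). The only hard-coded part is "connect and discover"; which tool gets called, and with what arguments, is decided by the LLM at runtime.

    import asyncio

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    async def main() -> None:
        # Placeholder command: launch whichever MCP server you want to talk to.
        server = StdioServerParameters(command="python", args=["finance_server.py"])

        async with stdio_client(server) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()

                # Steps 2-3: discovery — ask the server what it offers.
                tools = await session.list_tools()
                print([tool.name for tool in tools.tools])

                # Step 6: in a real app the LLM picks the tool and fills the
                # arguments; here one call is hard-coded just to show the shape.
                result = await session.call_tool("get_stock_price", arguments={"ticker": "NVDA"})
                print(result)

    asyncio.run(main())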


Example: Google Maps + Todoist

User asks:

“I’m going hiking in Leh. Show nearby places and create a to-do list.”

LLM automatically:

  • Uses Google Maps MCP server to find places

  • Uses Todoist MCP server to create tasks

  • Extracts location, latitude, longitude from text

  • Chains results together

Natural language → Action

No manual parameter wiring
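
Concretely, the "wiring" the LLM does on its own boils down to producing tool calls like the ones sketched below. The server and tool names are illustrative (borrowed from the examples above), not the real Google Maps or Todoist server schemas.

    # What the LLM effectively decides to call, purely from the user's sentence.
    maps_call = {
        "server": "google-maps",            # illustrative server name
        "tool": "map_search_places",        # tool name from the example above
        "arguments": {"query": "hiking trails near Leh",
                      "latitude": 34.16, "longitude": 77.58},
    }

    todoist_call = {
        "server": "todoist",                # illustrative server name
        "tool": "create_task",              # hypothetical tool name
        "arguments": {"content": "Pack gear for the hike in Leh"},
    }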


Important Clarification

MCP does NOT replace:

  • REST APIs

  • HTTP

  • Existing services

Instead, it wraps them with:

  • Standard schemas

  • Predictable behavior

  • LLM-friendly descriptions
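
In other words, the server is a thin wrapper. The sketch below takes the same kind of REST call from the earlier glue-code example (placeholder URL and fields, same SDK hedges as before) and moves it inside an MCP tool, so it lives in one place instead of in every app.

    import requests
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("finance-demo")

    # The MCP tool is a thin wrapper: the ordinary REST call still happens,
    # but only here, behind a standard schema and description.
    @mcp.tool()
    def get_stock_price(ticker: str) -> float:
        """Return the latest stock price for a ticker symbol (e.g. NVDA, TSLA)."""
        response = requests.get(f"https://api.example-finance.com/v1/quote/{ticker}", timeout=10)
        response.raise_for_status()
        return float(response.json()["regularMarketPrice"])

    if __name__ == "__main__":
        mcp.run()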


Why MCP Is Powerful

Before MCP

  • Thousands of developers write the same integration

  • High maintenance cost

  • Fragile systems

With MCP

  • Tool providers write integration once

  • Everyone reuses it

  • Easier upgrades

  • Cleaner AI architecture


Practical Usage Scenarios

Enterprise Chatbots

  • HR bot (payroll, leave, policies)

  • IT support bot (tickets, diagnostics)

Finance & Research

  • Stock analysis

  • Market summaries

  • Automated reports

Productivity Apps

  • Calendar + Email + Tasks

  • Travel planning

  • Meeting automation

Internal AI Platforms

  • Secure database access

  • File search

  • Knowledge retrieval

