This repository demonstrates how to implement the Model Context Protocol (MCP) architecture to connect Large Language Models (LLMs) with multiple external tools.
The Model Context Protocol (MCP) is a standardized way for LLMs to interact with external tools, resources, and prompts. This architecture allows for:
- Flexible tool integration: Connect any service to your LLM
- Standardized communication: Use a consistent protocol for all tool interactions
- Scalable architecture: Add new tools without changing the LLM or core system
- Multi-server support: Connect to multiple tool providers simultaneously
This demo consists of four main components:

- Calculator Server (`server.js`): implements a simple calculator service that provides:
  - Mathematical operations as tools (add, subtract, multiply, divide)
  - Documentation and history as resources
  - Prompts for calculations and documentation retrieval
- Weather Server (`weather-server.js`): implements a weather information service that provides:
  - Weather data tools (getCurrentWeather, getWeatherForecast)
  - Documentation and query history as resources
  - Prompts for weather reports
- MCP Client (`mcp-client.js`):
  - Generic MCP client for connecting to MCP servers
  - Provides a clean API to discover and use server capabilities
  - Handles communication details with LLM-friendly methods
- LLM Host Application (`llm-host-app.js`):
  - Connects to the OpenAI API
  - Integrates with multiple MCP servers simultaneously
  - Routes tool calls to the appropriate server
  - Converts MCP tools and prompts to OpenAI function calling format (see the sketch below)
  - Provides an interactive chat interface
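To make that last point concrete, here is a hedged sketch of how MCP tool definitions can be mapped to OpenAI's function-calling format. The helper name `toOpenAITools` is hypothetical; the real conversion lives in `llm-host-app.js` and may differ in detail:

```javascript
// Hypothetical helper: MCP tools already describe their inputs as JSON
// Schema, so mapping them onto OpenAI's tool format is a direct reshaping.
function toOpenAITools(mcpTools) {
  return mcpTools.map((tool) => ({
    type: "function",
    function: {
      name: tool.name,
      description: tool.description ?? "",
      parameters: tool.inputSchema, // JSON Schema from the server's tools/list
    },
  }));
}
```

The LLM then emits ordinary function calls, which the host app translates back into MCP `tools/call` requests.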
The project is structured as follows:

```
├── server.js          # Calculator MCP Server implementation
├── weather-server.js  # Weather MCP Server implementation
├── mcp-client.js      # MCP Client library for connecting to servers
├── llm-host-app.js    # LLM Host App that integrates with OpenAI
├── package.json       # Project dependencies and scripts
├── .env               # Environment variables (not committed to Git)
├── .env.example       # Example environment variables file
├── .gitignore         # Git ignore configuration
└── README.md          # This documentation
```
Prerequisites:

- Node.js (v14 or newer)
- npm (v6 or newer)
- An OpenAI API key (for the LLM host application)
To set up the project:

1. Clone this repository:

   ```bash
   git clone https://github.com/truffle-ai/mcp-demo.git
   cd mcp-demo
   ```

2. Install dependencies:

   ```bash
   npm install
   ```

3. Create a `.env` file with your OpenAI API key (copy it from `.env.example`):

   ```bash
   cp .env.example .env
   # Then edit .env to add your actual API key
   ```
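For reference, the host application only needs the API key from the environment. Assuming the conventional variable name `OPENAI_API_KEY` (check `.env.example` for the name this repo actually uses), `.env` would look like:

```
OPENAI_API_KEY=your-api-key-here
```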
This repository is already configured for Git with:

- A `.gitignore` file to exclude `node_modules`, `.env` files, and other unnecessary files
- Sample environment variables in `.env.example` (copy to `.env` for local use)
If you're setting up the project as a fresh repository of your own:

```bash
git init
git add .
git commit -m "Initial commit"
git branch -M main
git remote add origin https://github.com/truffle-ai/mcp-demo.git
git push -u origin main
```
Start the interactive LLM-powered multi-server chat:

```bash
npm start
```

This will automatically start both the calculator and weather servers, connect to them via the MCP client, and let you interact with them through the OpenAI LLM.
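Under the hood, each server runs as a child process that the host talks to over stdio. A minimal sketch of one such connection using the official `@modelcontextprotocol/sdk` directly (the repo's `mcp-client.js` wraps similar calls, so details may differ):

```javascript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const client = new Client({ name: "llm-host", version: "1.0.0" }, { capabilities: {} });

// Spawn `node server.js` and exchange MCP messages over its stdin/stdout
await client.connect(new StdioClientTransport({ command: "node", args: ["server.js"] }));

const { tools } = await client.listTools();
console.log(tools.map((t) => t.name)); // e.g. ["add", "subtract", "multiply", "divide"]
```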
If you want to start the servers individually:

```bash
# Start just the calculator server
npm run start-calculator

# Start just the weather server
npm run start-weather
```
The host application can connect to multiple MCP servers simultaneously:

```
                                ┌───────────────┐
                                │  Calculator   │
                                │  MCP Server   │
┌───────────────┐     ┌─────────┴───────────────┴─────────┐
│  LLM Host App │     │                                   │
│  (OpenAI API) │◄───►│   MCP Client with Server Router   │
└───────────────┘     │                                   │
                      └─────────┬───────────────┬─────────┘
                                │    Weather    │
                                │  MCP Server   │
                                └───────────────┘
```
Key features of the multi-server architecture:
- Server Router: The host app maintains a mapping of tool names to servers (see the sketch after this list)
- Tool Namespacing: Tools from different servers are labeled with their source
- Parallel Connections: Multiple server connections are maintained simultaneously
- Unified Interface: The LLM sees a single unified set of tools
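A hedged sketch of that router (the variable names here are hypothetical; see `llm-host-app.js` for the real implementation):

```javascript
// Build the tool-name -> client map by asking each connected server what
// it provides; `calcClient` and `weatherClient` are assumed to be
// already-connected MCP Client instances.
const toolToClient = new Map();
for (const client of [calcClient, weatherClient]) {
  const { tools } = await client.listTools();
  for (const tool of tools) toolToClient.set(tool.name, client);
}

// Route an LLM-issued tool call to whichever server owns the tool
async function routeToolCall(name, args) {
  const client = toolToClient.get(name);
  if (!client) throw new Error(`No server provides tool "${name}"`);
  return client.callTool({ name, arguments: args });
}
```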
Overall, the demo showcases:

- Multi-Server Routing: Direct tool calls to the correct server
- Tool Discovery: Dynamically discover available tools from multiple servers
- Resource Access: Read documentation and history from different servers
- Prompt Templates: Use server-defined prompt templates
- Error Handling: Proper handling of errors such as division by zero or invalid locations (see the sketch below)
- LLM Integration: Connect an actual LLM to multiple MCP servers
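On the error-handling point: MCP tool handlers can report failures in-band via an `isError` flag instead of throwing, which lets the LLM read the message and recover. A sketch of how the calculator's divide tool might guard against division by zero (the actual handler in `server.js` may differ):

```javascript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

const server = new McpServer({ name: "calculator", version: "1.0.0" });

server.tool("divide", { a: z.number(), b: z.number() }, async ({ a, b }) => {
  if (b === 0) {
    // Signal the failure in-band so the LLM can see and react to it
    return { content: [{ type: "text", text: "Error: division by zero" }], isError: true };
  }
  return { content: [{ type: "text", text: String(a / b) }] };
});
```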
You can use this demo as a template to create your own MCP-compatible multi-server architecture:

- Copy and modify `server.js` or `weather-server.js` to implement your own tools (a minimal skeleton is sketched below)
- Use the existing `mcp-client.js` to connect to your servers
- Add additional server connections to `llm-host-app.js`
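A minimal skeleton for a new server, assuming the official SDK's `McpServer` API (see `server.js` for a fuller example with resources and prompts):

```javascript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "my-server", version: "1.0.0" });

// Register one tool; the host app will discover it via tools/list
server.tool("shout", { text: z.string() }, async ({ text }) => ({
  content: [{ type: "text", text: text.toUpperCase() }],
}));

// Serve MCP over stdio so a host can spawn this file as a child process
await server.connect(new StdioServerTransport());
```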
License: MIT