MCP Architecture Demo

This repository demonstrates how to implement the Model Context Protocol (MCP) architecture to connect Large Language Models (LLMs) with multiple external tools.

What is MCP?

The Model Context Protocol (MCP) is a standardized way for LLMs to interact with external tools, resources, and prompts; a sample message exchange is shown after the list below. This architecture allows for:

  • Flexible tool integration: Connect any service to your LLM
  • Standardized communication: Use a consistent protocol for all tool interactions
  • Scalable architecture: Add new tools without changing the LLM or core system
  • Multi-server support: Connect to multiple tool providers simultaneously
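
Concretely, MCP messages are JSON-RPC 2.0. A tool invocation from a host to a server looks roughly like this (the method and field names follow the MCP specification; the add tool is the calculator example from this demo):

Request (host → server):

{ "jsonrpc": "2.0", "id": 1, "method": "tools/call",
  "params": { "name": "add", "arguments": { "a": 2, "b": 3 } } }

Response (server → host):

{ "jsonrpc": "2.0", "id": 1,
  "result": { "content": [{ "type": "text", "text": "5" }] } }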

Repository Components

This demo consists of four main components (a minimal server sketch follows the list):

  1. Calculator Server (server.js): Implements a simple calculator service that provides:

    • Mathematical operations as tools (add, subtract, multiply, divide)
    • Documentation and history as resources
    • Prompts for calculations and documentation retrieval
  2. Weather Server (weather-server.js): Implements a weather information service that provides:

    • Weather data tools (getCurrentWeather, getWeatherForecast)
    • Documentation and query history as resources
    • Prompts for weather reports
  3. MCP Client (mcp-client.js):

    • Generic MCP client for connecting to MCP servers
    • Provides a clean API to discover and use server capabilities
    • Handles communication details with LLM-friendly methods
  4. LLM Host Application (llm-host-app.js):

    • Connects to the OpenAI API
    • Integrates with multiple MCP servers simultaneously
    • Routes tool calls to the appropriate server
    • Converts MCP tools and prompts to OpenAI function calling format
    • Provides an interactive chat interface
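
For orientation, here is a minimal sketch of what tool registration in server.js could look like, assuming the official @modelcontextprotocol/sdk package and zod for input validation; the actual file may differ in structure and SDK version:

import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Declare the server and register one of the calculator tools
const server = new McpServer({ name: "calculator", version: "1.0.0" });

server.tool(
  "add",
  { a: z.number(), b: z.number() },        // input schema for the tool
  async ({ a, b }) => ({
    content: [{ type: "text", text: String(a + b) }],  // MCP content block
  })
);

// Serve over stdio so a host process can spawn this server and talk to it
await server.connect(new StdioServerTransport());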

Project Structure

├── server.js                # Calculator MCP Server implementation
├── weather-server.js        # Weather MCP Server implementation
├── mcp-client.js            # MCP Client library for connecting to servers
├── llm-host-app.js          # LLM Host App that integrates with OpenAI
├── package.json             # Project dependencies and scripts
├── .env                     # Environment variables (not committed to Git)
├── .env.example             # Example environment variables file
├── .gitignore               # Git ignore configuration
└── README.md                # This documentation

Prerequisites

  • Node.js (v14 or newer)
  • npm (v6 or newer)
  • OpenAI API key (for the LLM host application)

Installation

  1. Clone this repository

    git clone https://github.com/truffle-ai/mcp-demo.git
    cd mcp-demo
  2. Install dependencies

    npm install
  3. Create a .env file with your OpenAI API key (copy from .env.example; a sample follows these steps)

    cp .env.example .env
    # Then edit .env to add your actual API key
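
The only secret the host app needs is the OpenAI key. A typical .env, mirroring .env.example, would contain a single line; the exact variable name here is an assumption based on common convention:

OPENAI_API_KEY=your-api-key-here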

Git Setup

This repository is already configured for Git with:

  • A .gitignore file to exclude node_modules, .env files, and other unnecessary files
  • Sample environment variables in .env.example (copy to .env for local use)

If you're publishing this code as a new repository of your own (an existing clone already has Git configured):

git init
git add .
git commit -m "Initial commit"
git branch -M main
git remote add origin https://github.com/truffle-ai/mcp-demo.git
git push -u origin main

Usage

Running the Multi-Server LLM Host App

Start the interactive LLM-powered multi-server chat:

npm start

This will automatically start both the calculator and weather servers, connect to them via the MCP client, and let you interact with them through the OpenAI LLM.

Running Individual Servers

If you want to start the servers individually:

# Start just the calculator server
npm run start-calculator

# Start just the weather server
npm run start-weather
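
These commands imply package.json scripts along the following lines; the script names match the commands above, but the file contents shown here are an assumption, not a copy of the repo's package.json:

{
  "type": "module",
  "scripts": {
    "start": "node llm-host-app.js",
    "start-calculator": "node server.js",
    "start-weather": "node weather-server.js"
  }
}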

Multi-Server Architecture

The host application can connect to multiple MCP servers simultaneously:

┌───────────────┐     ┌─────────────────────────────┐     ┌───────────────┐
│  LLM Host App │◄───►│  MCP Client with Server     │◄───►│  Calculator   │
│ (OpenAI API)  │     │  Router                     │     │  MCP Server   │
└───────────────┘     │                             │     └───────────────┘
                      │                             │     ┌───────────────┐
                      │                             │◄───►│  Weather      │
                      └─────────────────────────────┘     │  MCP Server   │
                                                          └───────────────┘

Key features of the multi-server architecture (a routing sketch follows the list):

  1. Server Router: The host app maintains a mapping of tool names to servers
  2. Tool Namespacing: Tools from different servers are labeled with their source
  3. Parallel Connections: Multiple server connections are maintained simultaneously
  4. Unified Interface: The LLM sees a single unified set of tools
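
A minimal sketch of the routing idea, assuming per-server client objects with listTools/callTool methods as in the official SDK; the names calculatorClient and weatherClient are illustrative, not the actual API of mcp-client.js:

// Build a tool-name → client map while discovering tools from each server
const toolRouter = new Map();
for (const client of [calculatorClient, weatherClient]) {
  const { tools } = await client.listTools();        // MCP tools/list
  for (const tool of tools) toolRouter.set(tool.name, client);
}

// Later, dispatch an LLM tool call to whichever server owns the tool
async function callTool(name, args) {
  const client = toolRouter.get(name);
  if (!client) throw new Error(`Unknown tool: ${name}`);
  return client.callTool({ name, arguments: args }); // MCP tools/call
}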

Key Features Demonstrated

  • Multi-Server Routing: Direct tool calls to the correct server
  • Tool Discovery: Dynamically discover available tools from multiple servers
  • Resource Access: Read documentation and history from different servers
  • Prompt Templates: Use server-defined prompt templates
  • Error Handling: Proper handling of errors such as division by zero or invalid locations
  • LLM Integration: Connect an actual LLM to multiple MCP servers (see the conversion sketch below)
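
The MCP-to-OpenAI conversion mentioned above is mostly a field mapping. This sketch assumes each MCP tool descriptor carries name, description, and a JSON Schema inputSchema, which the MCP spec defines:

// Map MCP tool descriptors onto OpenAI function-calling "tools" entries
function toOpenAITools(mcpTools) {
  return mcpTools.map((tool) => ({
    type: "function",
    function: {
      name: tool.name,                   // namespace upstream if servers collide
      description: tool.description ?? "",
      parameters: tool.inputSchema,      // both sides speak JSON Schema
    },
  }));
}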

Extending the Demo

You can use this demo as a template to create your own MCP-compatible multi-server architecture (a connection sketch follows these steps):

  1. Copy and modify server.js or weather-server.js to implement your own tools
  2. Use the existing mcp-client.js to connect to your servers
  3. Add additional server connections to llm-host-app.js
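
With the official SDK, connecting the host to one more stdio server takes only a few lines. This sketch uses the @modelcontextprotocol/sdk client classes directly rather than the repo's mcp-client.js wrapper, whose exact API may differ, and my-new-server.js is a hypothetical file name:

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the new server as a child process and connect over stdio
const transport = new StdioClientTransport({
  command: "node",
  args: ["my-new-server.js"],   // hypothetical server file
});
const client = new Client({ name: "llm-host", version: "1.0.0" });
await client.connect(transport);

// Discover its tools so they can be merged into the host's router
const { tools } = await client.listTools();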

License

MIT
