> **Warning:** LangSmith MCP Server is under active development and many features are not yet implemented.
A production-ready Model Context Protocol (MCP) server that provides seamless integration with the LangSmith observability platform. This server enables language models to fetch conversation history and prompts from LangSmith.
The LangSmith MCP Server bridges the gap between language models and the LangSmith platform, enabling advanced capabilities for conversation tracking, prompt management, and analytics integration.
- Install uv (a fast Python package installer and resolver):

  ```bash
  curl -LsSf https://astral.sh/uv/install.sh | sh
  ```

- Clone this repository and navigate to the project directory:

  ```bash
  git clone https://github.com/langchain-ai/langsmith-mcp-server.git
  cd langsmith-mcp-server
  ```
Once you have the LangSmith MCP Server, you can integrate it with various MCP-compatible clients. You have two installation options:
- Install the package:

  ```bash
  uv run pip install --upgrade langsmith-mcp-server
  ```

- Add to your client MCP config:

  ```json
  {
    "mcpServers": {
      "LangSmith API MCP Server": {
        "command": "/path/to/uvx",
        "args": [
          "langsmith-mcp-server"
        ],
        "env": {
          "LANGSMITH_API_KEY": "your_langsmith_api_key"
        }
      }
    }
  }
  ```
Alternatively, to run the server from the cloned repository, add the following configuration to your MCP client settings:

```json
{
  "mcpServers": {
    "LangSmith API MCP Server": {
      "command": "/path/to/uv",
      "args": [
        "--directory",
        "/path/to/langsmith-mcp-server/langsmith_mcp_server",
        "run",
        "server.py"
      ],
      "env": {
        "LANGSMITH_API_KEY": "your_langsmith_api_key"
      }
    }
  }
}
```
Replace the following placeholders:

- `/path/to/uv`: the absolute path to your uv installation (e.g., `/Users/username/.local/bin/uv`). You can find it by running `which uv`.
- `/path/to/langsmith-mcp-server`: the absolute path to your langsmith-mcp project directory.
- `your_langsmith_api_key`: your LangSmith API key.
Example configuration:

```json
{
  "mcpServers": {
    "LangSmith API MCP Server": {
      "command": "/Users/mperini/.local/bin/uvx",
      "args": [
        "langsmith-mcp-server"
      ],
      "env": {
        "LANGSMITH_API_KEY": "lsv2_pt_1234"
      }
    }
  }
}
```
Copy this configuration into Cursor > MCP Settings.
If you want to develop or contribute to the LangSmith MCP Server, follow these steps:
- Create a virtual environment and install dependencies:

  ```bash
  uv sync
  ```

- To include test dependencies:

  ```bash
  uv sync --group test
  ```

- View available MCP commands:

  ```bash
  uvx langsmith-mcp-server
  ```
- For development, run the MCP inspector:

  ```bash
  uv run mcp dev langsmith_mcp_server/server.py
  ```

  - This will start the MCP inspector on a network port
  - Install any required libraries when prompted
  - The MCP inspector will be available in your browser
  - Set the `LANGSMITH_API_KEY` environment variable in the inspector
  - Connect to the server
  - Navigate to the "Tools" tab to see all available tools (a scripted alternative to the inspector is sketched after these steps)

- Before submitting your changes, run the linting and formatting checks:

  ```bash
  make lint
  make format
  ```
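If you prefer a scripted check over the browser inspector, the following minimal sketch uses the MCP Python SDK to launch the server from the cloned repository over stdio and print its tool names. It assumes the `mcp` client package is installed in your environment and that the placeholder paths and API key below are replaced with your own values.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Assumed placeholders: adjust the uv path, repository location, and API key.
server_params = StdioServerParameters(
    command="uv",
    args=[
        "--directory",
        "/path/to/langsmith-mcp-server/langsmith_mcp_server",
        "run",
        "server.py",
    ],
    env={"LANGSMITH_API_KEY": "your_langsmith_api_key"},
)


async def main() -> None:
    # Launch the server as a subprocess and talk to it over stdio.
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


asyncio.run(main())
```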
The server enables powerful capabilities including:
- 💬 Conversation History: "Fetch the history of my conversation with the AI assistant from thread 'thread-123' in project 'my-chatbot'"
- 📚 Prompt Management: "Get all public prompts in my workspace"
- 🔍 Smart Search: "Find private prompts containing the word 'joke'"
- 📝 Template Access: "Pull the template for the 'legal-case-summarizer' prompt"
- 🔧 Configuration: "Get the system message from a specific prompt template"
The LangSmith MCP Server provides the following tools for integration with LangSmith:
| Tool Name | Description |
|---|---|
| `list_prompts` | Fetch prompts from LangSmith with optional filtering. Filter by visibility (public/private) and limit results. |
| `get_prompt_by_name` | Get a specific prompt by its exact name, returning the prompt details and template. |
| `get_thread_history` | Retrieve the message history for a specific conversation thread, returning messages in chronological order. |
| `get_project_runs_stats` | Get statistics about runs in a LangSmith project, either for the last run or overall project stats. |
| `fetch_trace` | Fetch trace content for debugging and analyzing LangSmith runs using project name or trace ID. |
| `list_datasets` | Fetch LangSmith datasets with filtering options by ID, type, name, or metadata. |
| `list_examples` | Fetch examples from a LangSmith dataset with advanced filtering options. |
| `read_dataset` | Read a specific dataset from LangSmith using dataset ID or name. |
| `read_example` | Read a specific example from LangSmith using the example ID and optional version information. |
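As a rough illustration of how an MCP client exercises these tools, the sketch below uses the MCP Python SDK to start the published server via `uvx` and invoke `get_thread_history` for the conversation-history example above. The argument names (`thread_id`, `project_name`) are assumptions made for illustration; consult the schema returned by `list_tools()` for the exact parameters.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Assumes `uvx` is on your PATH and the API key placeholder is replaced.
server_params = StdioServerParameters(
    command="uvx",
    args=["langsmith-mcp-server"],
    env={"LANGSMITH_API_KEY": "your_langsmith_api_key"},
)


async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Hypothetical arguments; the real parameter names come from the
            # tool schema exposed by the server.
            result = await session.call_tool(
                "get_thread_history",
                {"thread_id": "thread-123", "project_name": "my-chatbot"},
            )
            for block in result.content:
                print(block)


asyncio.run(main())
```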
This project is distributed under the MIT License. For detailed terms and conditions, please refer to the LICENSE file.
Made with ❤️ by the LangChain Team