Description
Describe the new feature or enhancement
I have developed an EEG MCP (Model Context Protocol) Server that integrates MNE-Python for EEG data processing and uses LLMs (Large Language Models) to enable intelligent EEG-based interactions.
The server exposes endpoints like /read, /filter, /features, /visualize, and /summarize, and is connected to an LLM tool agent that chooses when to call these endpoints based on user queries.
This demonstrates how MNE can be leveraged in real-time, AI-driven pipelines and gives others a starting point for building intelligent EEG systems quickly.
Demo Video: https://www.youtube.com/shorts/bWYk8Bbx7Uk
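To illustrate the agent-to-endpoint flow described above, here is a minimal, hypothetical sketch of how a tool-calling agent might route a structured call to the server's endpoints. The endpoint names match this issue, but the handlers are stand-ins for the real MNE-backed implementations:

```python
# Hypothetical dispatch sketch: the handlers below are placeholders,
# not the server's actual implementation.

def read(path):
    # The real handler would call mne.io.read_raw_edf(path).
    return f"loaded {path}"

def filter_data(low, high):
    # The real handler would call raw.filter(low, high).
    return f"filtered {low}-{high} Hz"

ENDPOINTS = {"/read": read, "/filter": filter_data}

def dispatch(call):
    # An LLM tool agent emits a structured call naming an endpoint
    # and its arguments; the server looks up and invokes the handler.
    handler = ENDPOINTS[call["endpoint"]]
    return handler(**call["args"])

print(dispatch({"endpoint": "/read", "args": {"path": "subject01.edf"}}))
# prints "loaded subject01.edf"
```

The /features, /visualize, and /summarize endpoints would plug into the same dispatch table.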
Describe your proposed implementation
I propose adding this MCP EEG Server integration as either:
- A folder under examples/integrations/mcp_server/, or
- A documentation reference/link for “real-time EEG server pipelines using MNE”
The implementation includes:
- An MCP server that uses MNE to:
  - Read EDF files with mne.io.read_raw_edf
  - Apply filters and extract channel data and features
  - Visualize EEG plots programmatically
- An interface for external LLM tools to call these endpoints intelligently.
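As a rough illustration of what the filtering and feature steps compute, here is a self-contained sketch using SciPy on synthetic data (MNE's raw.filter and PSD routines offer a much richer interface; the band definitions and Butterworth design here are assumptions for the example, not the server's actual code):

```python
import numpy as np
from scipy.signal import butter, filtfilt, welch

def bandpass(data, sfreq, low, high, order=4):
    # Minimal Butterworth band-pass stand-in for an MNE filter step.
    nyq = sfreq / 2.0
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, data)

def band_power(data, sfreq, fmin, fmax):
    # Average PSD within [fmin, fmax] via Welch's method.
    freqs, psd = welch(data, fs=sfreq, nperseg=int(sfreq))
    mask = (freqs >= fmin) & (freqs <= fmax)
    return psd[mask].mean()

sfreq = 250.0
t = np.arange(0, 10, 1 / sfreq)
# Synthetic "EEG": a 10 Hz alpha rhythm plus noise.
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

alpha = band_power(bandpass(signal, sfreq, 8, 13), sfreq, 8, 13)
beta = band_power(bandpass(signal, sfreq, 13, 30), sfreq, 13, 30)
# The 10 Hz component dominates the alpha band, as expected.
assert alpha > beta
```

In the server itself, the same kind of band-power summary would be computed from data loaded with mne.io.read_raw_edf and returned by the /features endpoint.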
This bridges the gap between EEG software and AI agents, and shows MNE’s value in modern intelligent applications.
Describe possible alternatives
MNE currently has no official MCP-based EEG API server, nor an example of real-time LLM interfacing.
This project combines the strengths of MNE with tool-driven intelligence and provides a great entry point for researchers and developers to build real-time, AI-enhanced EEG platforms.
Additional context
No response