This Docker Compose setup creates a comprehensive MCP (Model Context Protocol) server aggregation system that exposes multiple MCP servers through a single OpenAI-compatible endpoint.
Note: This setup includes the following MCP servers: HomeAssistant, YNAB, Notion, Fetch, Search, Calculator, and Memory (the official knowledge graph-based persistent memory server).
```
Individual MCP Servers (Docker containers)
├── HomeAssistant MCP (Python stdio → HTTP)
├── YNAB MCP (FastMCP HTTP)
├── Notion MCP (Python stdio → HTTP)
├── Fetch MCP (Node.js stdio → HTTP)
├── Search MCP (Python stdio → HTTP)
├── Calculator MCP (Python stdio → HTTP)
└── Memory MCP (Node.js stdio → HTTP)
                ↓
       FastMCP Aggregator
   (combines all MCP servers)
                ↓
              MCPO
  (OpenAI tool server wrapper)
                ↓
        OWUI / clients
```
Each MCP server runs in its own isolated container:
- mcp-homeassistant (Port 3001): Home Assistant integration using voska/hass-mcp
- mcp-ynab (Port 3002): YNAB (You Need A Budget) integration using ntdef/ynab-mcp
- mcp-notion (Port 3003): Notion workspace integration using makenotion/notion-mcp-server
- mcp-fetch (Port 3004): Web content fetching from modelcontextprotocol/servers
- mcp-search (Port 3005): Web search functionality using mrkrsl/web-search-mcp
- mcp-calculator (Port 3006): Mathematical calculations using githejie/mcp-server-calculator
- mcp-memory (Port 3007): Knowledge graph-based persistent memory using modelcontextprotocol/servers
- mcp-aggregator (Port 3100): FastMCP instance that combines all individual MCPs
- mcpo (Port 8080): MCPO wrapper providing OpenAI-compatible API
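For orientation, a minimal sketch of how two of these services could be wired together in `docker-compose.yml` (service names and ports follow the list above; the build paths and the `MCP_ENDPOINTS` variable are assumptions for illustration, not the shipped file):

```yaml
services:
  mcp-calculator:
    build: ./services/mcp-calculator
    ports:
      - "3006:3006"

  mcp-aggregator:
    build: ./services/mcp-aggregator
    ports:
      - "3100:3100"
    environment:
      # Hypothetical variable showing how the aggregator might discover upstreams
      - MCP_ENDPOINTS=http://mcp-calculator:3006,http://mcp-fetch:3004
    depends_on:
      - mcp-calculator
```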
- Clone and configure:

  ```sh
  # Windows
  build.bat

  # Linux/Mac
  ./build.sh
  ```
- Update environment variables: edit the `.env` file with your actual API keys and tokens:

  ```
  HASS_URL=http://your-homeassistant:8123
  HASS_TOKEN=your_token
  YNAB_TOKEN=your_ynab_token
  NOTION_TOKEN=your_notion_token
  SEARCH_API_KEY=your_search_key
  ```
- Start the stack:

  ```sh
  docker-compose up -d
  ```
- Access the API:
  - OpenAI-compatible endpoint: `http://localhost:8080`
  - Individual MCPs: `http://localhost:300X` (where X is 1-7)
  - Aggregator: `http://localhost:3100`
The main endpoint at `localhost:8080` provides an OpenAI-compatible tool server that can be used with OWUI or any other OpenAI-compatible client.
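For example, once the stack is up you can exercise the endpoint with curl. The route and payload below are illustrative; MCPO generates its actual routes from each tool's schema, which you can inspect in its auto-generated docs:

```sh
# Browse the auto-generated OpenAPI docs
curl http://localhost:8080/docs

# Call a tool (illustrative route and payload; check the docs for real schemas)
curl -X POST http://localhost:8080/calculator/calculate \
  -H "Content-Type: application/json" \
  -d '{"expression": "2 + 2"}'
```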
- Create a new service directory: `services/mcp-newserver/`
- Add a Dockerfile and any proxy scripts needed
- Update `docker-compose.yml` with the new service (see the sketch after this list)
- Update the aggregator configuration to include the new endpoint
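As a sketch, the new compose entry might look like this (`mcp-newserver` and port 3008 are hypothetical placeholders):

```yaml
services:
  mcp-newserver:
    build: ./services/mcp-newserver   # hypothetical directory from the steps above
    ports:
      - "3008:3008"                   # assumed free port
```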
Two proxy base images are available for wrapping stdio servers:

- `base-images/python-proxy/`: for Python stdio MCP servers
- `base-images/node-proxy/`: for Node.js stdio MCP servers
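A hypothetical Dockerfile for a Python stdio server built on that base could look like the sketch below. The base image tag, the package name, and the `MCP_COMMAND` variable are all assumptions about the proxy contract, not the actual one:

```dockerfile
# Sketch only: the base image tag and the MCP_COMMAND contract are assumptions
FROM mcp-python-proxy:latest

# Hypothetical package providing the stdio MCP server
RUN pip install --no-cache-dir some-mcp-server

# Command the proxy wrapper is assumed to launch over stdio
ENV MCP_COMMAND="some-mcp-server"

EXPOSE 3008
```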
- Each MCP runs in isolation for security and environment separation
- All stdio-based MCPs are wrapped with FastMCP to provide HTTP endpoints
- Only the final MCPO endpoint (port 8080) needs to be exposed publicly
- Internal communication happens over the Docker network
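If you want to lock the stack down accordingly, a sketch of how that could look is below: publish only `mcpo` with `ports`, and switch the individual MCPs to `expose` so they stay on the Docker network. Note the quick-start instructions above assume the individual ports are also published for local access:

```yaml
services:
  mcpo:
    ports:
      - "8080:8080"   # the only port published to the host
  mcp-calculator:
    expose:
      - "3006"        # reachable by other containers, not by the host
```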
- Check individual MCP health: `curl http://localhost:300X/health` (where X is 1-7)
- View logs: `docker-compose logs mcp-servicename`
- Restart a specific service: `docker-compose restart mcp-servicename`
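To check every health endpoint in one pass (ports taken from the service list above):

```sh
for port in 3001 3002 3003 3004 3005 3006 3007 3100; do
  printf 'port %s: ' "$port"
  curl -fsS "http://localhost:$port/health" >/dev/null && echo OK || echo FAIL
done
```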
Port Conflicts:
- If ports 3001-3007, 3100, or 8080 are in use, update `docker-compose.yml` to use different ports
- Make sure to update the aggregator environment variables accordingly
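For example, to move the fetch server off a conflicting host port, remap only the host side:

```yaml
  mcp-fetch:
    ports:
      - "4004:3004"   # host 4004 -> container 3004
```

If you only remap the host side, container-to-container addresses on the Docker network are unchanged; if you change the container port itself, update the aggregator's environment to match.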
API Token Issues:
- HomeAssistant: Create a Long-Lived Access Token in Home Assistant
- YNAB: Get your API token from YNAB Developer Settings
- Notion: Create an integration at Notion Integrations
Build Failures:
- Ensure Docker has sufficient resources (4GB+ RAM recommended)
- Some servers require internet access during build to clone repositories
- Check Docker logs: `docker-compose logs --tail=50 service-name`
Note: The current proxy implementations are placeholders. For full functionality, they need to implement actual FastMCP stdio→HTTP proxying. This is a known limitation that requires additional development.
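A full implementation could build on FastMCP's proxying support, roughly as sketched below. The API names follow FastMCP 2.x documentation and should be verified against the version you pin; the script path is a placeholder:

```python
# Sketch: stdio -> HTTP proxying with FastMCP 2.x; verify the API against
# the version you install before relying on this.
from fastmcp import FastMCP

# Wrap a stdio MCP server (placeholder script path) behind an HTTP transport
proxy = FastMCP.as_proxy("server.py", name="stdio-proxy")

if __name__ == "__main__":
    # Serve the proxied tools over HTTP inside the container
    proxy.run(transport="http", host="0.0.0.0", port=3001)
```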