MCPServerStdio bug when passing env vars as param #1200

Closed
1 task done
fcestari opened this issue Mar 21, 2025 · 2 comments

fcestari commented Mar 21, 2025

Initial Checks

  • I confirm that I'm using the latest version of Pydantic AI

Description

Hi. I am having issues starting an MCP server with the MCPServerStdio class whenever I pass a dict[str, str] to the env parameter.

I can confirm that I have npx installed and have successfully used MCPServerStdio without passing the env parameter.

Example Code

import asyncio

from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.providers.openai import OpenAIProvider
from pydantic_ai.mcp import MCPServerStdio

github_server = MCPServerStdio(
    command="npx",
    args=["@modelcontextprotocol/server-github"],
    # passing env here is what triggers the failure below
    env={"GITHUB_PERSONAL_ACCESS_TOKEN": "<github-pat-here>"},
)

ollama_model = OpenAIModel(
    model_name="llama3.2", provider=OpenAIProvider(base_url="http://localhost:11434/v1")
)

agent = Agent(ollama_model, mcp_servers=[github_server])


async def main():
    async with agent.run_mcp_servers():
        result = await agent.run("list issues from repo XXX")
    print(result.data)


if __name__ == "__main__":
    asyncio.run(main())

---
Error:

File "/opt/homebrew/Cellar/[email protected]/3.13.2/Frameworks/Python.framework/Versions/3.13/lib/python3.13/subprocess.py", line 1974, in _execute_child
    raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: 'npx'
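
The traceback suggests that supplying env replaces the spawned process's environment, so PATH is no longer visible and 'npx' cannot be resolved. A possible workaround (a sketch, not from the original report, assuming PATH is being dropped when env is supplied; the token value is still a placeholder) is to merge the current environment into the dict:

import os

github_server = MCPServerStdio(
    command="npx",
    args=["@modelcontextprotocol/server-github"],
    # merging os.environ keeps PATH visible so the subprocess can find npx
    env={**os.environ, "GITHUB_PERSONAL_ACCESS_TOKEN": "<github-pat-here>"},
)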

Python, Pydantic AI & LLM client version

Python 3.13, Pydantic AI 0.0.43, openai v1.68.0 (Azure OpenAI and Ollama)

Kludex (Member) commented Mar 21, 2025

It was fixed on mcp's side: modelcontextprotocol/python-sdk#327

If you bump the mcp dependency to 1.5.0, it should work as expected (the release is in progress and should be available in a few hours).
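
For reference, a minimal sketch (standard library only) to confirm which mcp version is installed after upgrading:

from importlib.metadata import version

print(version("mcp"))  # expect "1.5.0" or newer once the release is out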

Kludex closed this as completed Mar 21, 2025

fcestari (Author) commented

Thanks!
