[WIP] feat: mcp server prototype #588
base: main
Conversation
My intuition tells me that a user-editable AGENTS.md and functional LSP linting give a lot more value than an MCP server. Ripple-bench? Consensus is brewing among LLM users and providers that MCP was brilliant on paper, but in practice it makes bots less intelligent and more costly. For example, I'd wager none of the currently included tools improves performance enough to justify including them in every LLM request.
@jsudelko An MCP server would be very beneficial for a fast-changing, early-stage framework, since we are not going to be well recognized by the LLMs (yet). It adds an additional layer of knowledge for the LLM to work with, without polluting the AGENTS.md / llms.txt files (which need to stay as clean as possible). Here the role of the MCP server is to perform some of the actions and checks for the user. By giving it a knowledge base, we make sure that our rapid schema changes will not break agentic coding capabilities.

We can argue about how effective or ineffective MCP servers are, but that's not the point. We need a tool that gives users a quicker way to jump into, propose, and orchestrate their work and reduces the initial hassle around some of our own internal choices (for example "@" [reactivity], a different mental model, compat layers, etc.). The server will gain more features over time (such as auto-refactoring of React-like modules/components) that play well with LLMs like Claude / GPT / Gemini and let users adopt Ripple faster in their codebases. For now it is not that bloated: the complete tool definitions are roughly ~316 tokens in size. Another benefit is that you, as a user, have the choice to either connect the MCP server or stick to a pure agentic workflow with AGENTS.md / TASKS.md files. A rough sketch of what such a compiler-backed tool could look like is shown below.
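As a point of reference for the discussion above, here is a minimal sketch of a compiler-backed MCP tool built with the official `@modelcontextprotocol/sdk` and `zod`. The tool name `check_ripple_source` and the `ripple/compiler` import with a `compile(source, options)` signature are assumptions for illustration only; the actual tool set and internal APIs used by `@ripple-ts/mcp-server` may differ.

```ts
// Minimal sketch of a compiler-backed MCP tool (assumptions noted above).
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "ripple-mcp", version: "0.0.1" });

// One small, cheap tool: ask the Ripple compiler whether a snippet is valid,
// so the agent can verify generated code instead of guessing at the syntax.
server.tool(
  "check_ripple_source",
  "Compile a Ripple component and report any diagnostics",
  { source: z.string().describe("Ripple source code to check") },
  async ({ source }) => {
    try {
      // Hypothetical compiler entry point; stands in for whatever the
      // real @ripple-ts/mcp-server calls internally.
      const { compile } = await import("ripple/compiler");
      compile(source, { filename: "snippet.ripple" });
      return { content: [{ type: "text", text: "No diagnostics." }] };
    } catch (err) {
      return {
        content: [{ type: "text", text: `Compile error: ${(err as Error).message}` }],
        isError: true,
      };
    }
  }
);

// Expose the server over stdio so any MCP-capable client can connect to it.
await server.connect(new StdioServerTransport());
```

Keeping tools this narrow is also how the ~316-token budget mentioned above stays small: each tool definition is a name, a short description, and a tiny parameter schema.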
This PR introduces @ripple-ts/mcp-server, a Model Context Protocol (MCP) server that exposes Ripple's compiler capabilities to LLMs.

What it brings: