Exhub is an Elixir-powered enhancement plugin for Emacs, based on WebSocket communication. It facilitates real-time interaction and communication between Emacs and the Elixir server.
- WebSocket Communication: Establishes a bi-directional connection between Emacs and Elixir using WebSockets.
- Message Handling: Enables sending and receiving messages between Emacs and the Elixir server.
- Erlang/Elixir Backend: Leverages Elixir and Erlang for robust backend processing.
- Emacs Integration: Provides Emacs Lisp functions to interact seamlessly with the Elixir server.
- Agent Integration: Allows integration with agents for enhanced functionality.
- MCP Tools Integration: Provides integration with MCP Tools for extended functionality.
- Code Completion: LLM-powered code completion with two modes: chat-based completion, which uses specialized prompts and enhancements tailored to chat LLMs, and fill-in-the-middle (FIM) completion for compatible models.
- Advanced Configuration Management: Enhanced LLM configuration server with validation, error handling, and type specifications.
- Clone the Repository:

  ```shell
  git clone https://github.com/edmondfrank/exhub.git
  cd exhub
  ```

- Install Dependencies:

  ```shell
  mix deps.get
  ```
- Configuration:

  The configuration for Exhub is managed in `config/config.exs`. Here are the relevant settings:
- LLM Configuration with DRY Approach:

  ```elixir
  # Store common values in variables instead of module attributes
  gitee_api_base = "https://ai.gitee.com/v1"
  gitee_api_key = "your api key"

  # Define LLM configurations using the DRY approach
  llms_config = %{
    "openai/Qwen2.5-72B-Instruct" => %{api_base: gitee_api_base, api_key: gitee_api_key, model: "openai/Qwen2.5-72B-Instruct"},
    "openai/Qwen3-235B-A22B" => %{api_base: gitee_api_base, api_key: gitee_api_key, model: "openai/Qwen3-235B-A22B"},
    "openai/qwen3-next-80b-a3b-instruct" => %{api_base: gitee_api_base, api_key: gitee_api_key, model: "openai/qwen3-next-80b-a3b-instruct"},
    "openai/qwen3-235b-a22b-instruct-2507" => %{api_base: gitee_api_base, api_key: gitee_api_key, model: "openai/qwen3-235b-a22b-instruct-2507"},
    "openai/qwen3-coder-480b-a35b-instruct" => %{api_base: gitee_api_base, api_key: gitee_api_key, model: "openai/qwen3-coder-480b-a35b-instruct"},
    "openai/kimi-k2-instruct" => %{api_base: gitee_api_base, api_key: gitee_api_key, model: "openai/kimi-k2-instruct"},
    "openai/cursor/gpt-4o-mini" => %{api_base: "http://127.0.0.1:9069/openai/v1", api_key: "your token", model: "openai/cursor/gpt-4o-mini"},
    "openai/gpt-4o-mini" => %{api_base: "http://localhost:4444/v1", api_key: "edmondfrank", model: "openai/gpt-4o-mini"},
    "openai/deepseek-v3_1-terminus" => %{api_base: gitee_api_base, api_key: gitee_api_key, model: "openai/deepseek-v3_1-terminus"},
    "openai/deepseek-v3.2-exp" => %{api_base: gitee_api_base, api_key: gitee_api_key, model: "openai/deepseek-v3.2-exp"},
    "openai/glm-4.6" => %{api_base: gitee_api_base, api_key: gitee_api_key, model: "openai/glm-4.6"},
    "openai/DeepSeek-V3" => %{api_base: gitee_api_base, api_key: gitee_api_key, model: "openai/DeepSeek-V3"},
    "openai/QwQ-32B" => %{api_base: "http://localhost:9069/samba/v1", api_key: "your token", model: "openai/QwQ-32B"},
    "openai/gemini-2.5-pro" => %{api_base: "http://127.0.0.1:9069/openai/v1", api_key: "your token", model: "openai/gemini-2.5-pro"},
    "openai/Qwen/Qwen2.5-Coder-32B-Instruct" => %{api_base: "https://api.siliconflow.cn/v1", api_key: "your token", model: "openai/Qwen/Qwen2.5-Coder-32B-Instruct"},
    "openai/Qwen/Qwen2.5-32B-Instruct" => %{api_base: "https://api.siliconflow.cn/v1", api_key: "your token", model: "openai/Qwen/Qwen2.5-32B-Instruct"},
    "codestral/codestral-latest" => %{api_base: "https://codestral.mistral.ai/v1", api_key: "your token", model: "mistral/codestral-latest"},
    "anthropic/claude-3-5-sonnet-latest" => %{api_base: "http://127.0.0.1:9069/anthropic/v1", api_key: "your token", model: "anthropic/claude-3-5-sonnet-latest"},
    "mistral/mistral-small-latest" => %{api_base: "https://api.mistral.ai/v1", api_key: "your token", model: "mistral/mistral-small-latest"},
    "mistral/mistral-large-latest" => %{api_base: "https://api.mistral.ai/v1", api_key: "your token", model: "mistral/mistral-large-latest"},
    "groq/llama-3.3-70b-versatile" => %{api_base: "http://127.0.0.1:9069/groq/v1", api_key: "your token", model: "openai/llama-3.3-70b-versatile"},
    "gemini/gemini-2.0-flash" => %{api_base: "http://127.0.0.1:9069/google/v1", api_key: "your token", model: "google/gemini-2.0-flash"},
    "command-r-plus" => %{api_base: "http://127.0.0.1:9069/cohere/v1", api_key: "your token", model: "openai/command-r-plus"},
    "command-a-03-2025" => %{api_base: "http://127.0.0.1:9069/cohere/v1", api_key: "your token", model: "openai/command-a-03-2025"}
  }

  config :exhub,
    gemini_api_base: "http://localhost:8765/v1",
    giteeai_api_key: gitee_api_key,
    openai_api_key: "your token",
    llms: llms_config,
    proxy: "http://127.0.0.1:7890",
    gitee_cat: %{
      endpoint: "https://api.gitee.com/",
      auth: %{cookie: "your cookie"} # or %{access_token: "your access token"}
    }
  ```
- Build:

  ```shell
  MIX_ENV=prod mix release
  ```
- Run the Server:

  ```shell
  _build/prod/rel/exhub/bin/exhub start
  ```
- Install the Emacs Package:

  Add the following to your Emacs configuration file (e.g., `~/.emacs.d/init.el`):

  ```emacs-lisp
  (add-to-list 'load-path (expand-file-name "site-lisp/exhub" user-emacs-directory))
  (require 'exhub)
  (exhub-start-elixir)
  (exhub-start)
  ```
Use the `exhub-send` function to send messages to the Elixir server:

```emacs-lisp
(exhub-send "your message here")
```

## exhub-tool

The exhub-tool package provides integration with MCP Tools using Exhub.
Add the following to your Emacs configuration file (e.g., ~/.emacs.d/init.el):
```emacs-lisp
(require 'exhub-tool)
```

- `exhub-start-git-mcp-server`: Start the Git MCP server (requires `pip install mcp_server_git` first).
- `exhub-start-file-mcp-server`: Start the File MCP server (requires `npx`).
- `exhub-start-k8s-mcp-server`: Start the K8s MCP server (requires `npx` and `kubectl`).
- `exhub-start-gitee-mcp-server`: Start the Gitee MCP server (requires `mcp-gitee`).
- `exhub-start-github-mcp-server`: Start the GitHub MCP server (requires `npx`).
- `exhub-start-gitee-mcp-ent-server`: Start the Gitee Enterprise MCP server (requires `mcp-gitee-ent`).
- `exhub-stop-git-mcp-server`: Stop the Git MCP server.
- `exhub-stop-file-mcp-server`: Stop the File MCP server.
- `exhub-stop-gitee-mcp-server`: Stop the Gitee MCP server.
- `exhub-stop-github-mcp-server`: Stop the GitHub MCP server.
- `exhub-stop-k8s-mcp-server`: Stop the Kubernetes MCP server.
- `exhub-stop-gitee-mcp-ent-server`: Stop the Gitee Enterprise MCP server.
- `exhub-stop-all-mcp-servers`: Stop all MCP servers.
- `exhub-chat-with-git`: Chat with Exhub using a registered Git server.
- `exhub-chat-with-tools`: Chat with Exhub using all tools.
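If certain MCP servers should be available in every session, the start commands can be invoked from `init.el` after loading exhub-tool. A minimal sketch, assuming the external prerequisites noted above (`mcp_server_git`, `npx`) are already installed:

```emacs-lisp
(require 'exhub-tool)
;; Start the MCP servers this setup relies on at startup.
;; Each start command assumes its external prerequisite is installed.
(exhub-start-git-mcp-server)
(exhub-start-file-mcp-server)
```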
## exhub-config

The exhub-config package provides configuration management for Exhub.
```emacs-lisp
;; Built-in extension of the exhub package
(require 'exhub)
```

- `exhub-switch-model`: Switch the active model; queries the Exhub configuration for available models and prompts the user to select one.
## exhub-gitee

The exhub-gitee package provides Gitee integration for Emacs using Exhub.
Add the following to your Emacs configuration file (e.g., ~/.emacs.d/init.el):
```emacs-lisp
(require 'exhub-gitee)
```

- `gitee-open-issues-buffer`: Open a new Org-mode buffer displaying Gitee issues.
- `gitee-open-issue-detail-buffer`: Open a new Org-mode buffer displaying the details of a Gitee issue.
- `gitee-open-pulls-buffer`: Open a new Org-mode buffer displaying Gitee pull requests.

## exhub-chat
The exhub-chat package provides chat functionality for Emacs using Exhub.
Add the following to your Emacs configuration file (e.g., ~/.emacs.d/init.el):
```emacs-lisp
(require 'exhub-chat)
```

- `exhub-chat`: Start a chat session with Exhub.
- `exhub-chat-with-temp-buffer`: Start a chat session in a new temporary buffer.
- `exhub-chat-with-multiline`: Start a chat session using a multiline input buffer.
- `exhub-chat-with-multiline-with-temp-buffer`: Start a multiline chat session in a new temporary buffer.
- `exhub-chat-optimize-prompts`: Optimize prompts.
- `exhub-chat-generate-code`: Generate code.
- `exhub-chat-adjust-code`: Adjust code.
- `exhub-chat-comment-code`: Add comments to code.
- `exhub-chat-refactory-code`: Refactor code.
- `exhub-chat-format-code`: Format code.
- `exhub-chat-polish-document`: Polish and proofread a document.
- `exhub-chat-improve-document`: Correct grammar and spelling errors in a document.
- `exhub-chat-explain-code`: Explain code.
- `exhub-chat-generate-commit-message`: Generate a commit message.
- `exhub-chat-generate-pull-desc`: Generate a pull request description.
- `exhub-chat-generate-pull-review`: Generate a pull request review.
- `exhub-chat-translate-into-chinese`: Translate text into Chinese.
- `exhub-chat-translate-into-english`: Translate text into English.
- `exhub-chat-choose-drafts`: Choose from saved drafts.
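The chat commands need no setup beyond `(require 'exhub-chat)`; for frequent use they can be grouped under a prefix key. The bindings below are illustrative choices, not package defaults:

```emacs-lisp
;; Illustrative keybindings (our own choices, not Exhub defaults).
(define-prefix-command 'my/exhub-chat-map)
(global-set-key (kbd "C-c e") 'my/exhub-chat-map)
(define-key my/exhub-chat-map (kbd "c") #'exhub-chat)
(define-key my/exhub-chat-map (kbd "x") #'exhub-chat-explain-code)
(define-key my/exhub-chat-map (kbd "m") #'exhub-chat-generate-commit-message)
```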
## exhub-translate

The exhub-translate package provides translation functionality for Emacs using Exhub.
Add the following to your Emacs configuration file (e.g., ~/.emacs.d/init.el):
```emacs-lisp
(require 'exhub-translate)
```

- `exhub-translate-insert`: Insert a translation appropriate for the current mode.
- `exhub-translate-insert-original-translation`: Insert the original translation.
- `exhub-translate-insert-with-line`: Insert a translation in line style.
- `exhub-translate-insert-with-underline`: Insert a translation in underline style.
- `exhub-translate-insert-with-camel`: Insert a translation in camel-case style.
- `exhub-translate-replace`: Replace the current symbol with its English translation.
- `exhub-translate-replace-with-line`: Replace in line style.
- `exhub-translate-replace-with-underline`: Replace in underline style.
- `exhub-translate-replace-with-camel`: Replace in camel-case style.
- `exhub-translate-replace-zh`: Translate the selected region into Chinese and replace it.
- `exhub-translate-posframe`: Show the translation in a posframe.
## exhub-file

The exhub-file package provides file operations for Emacs using Exhub.
Add the following to your Emacs configuration file (e.g., ~/.emacs.d/init.el):
```emacs-lisp
(require 'exhub-file)
```

- `exhub-preview-markdown`: Preview the current buffer as Markdown using Exhub.
## exhub-agent

The exhub-agent package provides agent integration for Emacs using Exhub.
Add the following to your Emacs configuration file (e.g., ~/.emacs.d/init.el):
```emacs-lisp
(require 'exhub-agent)
```

- `exhub-agent-chat`: Chat with an existing or new agent in the Exhub world.
- `exhub-agent-tool-reply`: Reply to an existing agent using the output of a shell command.
- `exhub-agent-init-tools`: Initialize tools with an existing or new agent in the Exhub world.
- `exhub-agent-tool-call`: Interact with tools using an existing or new agent in the Exhub world.
- `exhub-agent-kill`: Kill an existing agent in the Exhub world.
When in exhub-agent-mode, the following keybindings are available:
- `C-c c`: `exhub-agent-chat`
- `C-c r`: `exhub-agent-tool-reply`
- `C-c i`: `exhub-agent-init-tools`
- `C-c k`: `exhub-agent-kill`
- `C-c t`: `exhub-agent-tool-call`
## exhub-fim

The exhub-fim package provides LLM-powered code completion in two modes: chat-based completion, which uses specialized prompts and enhancements tailored to chat LLMs, and fill-in-the-middle (FIM) completion for compatible models.
Add the following to your Emacs configuration file (e.g., ~/.emacs.d/init.el):
```emacs-lisp
(require 'exhub-fim)
```

- `exhub-fim-show-suggestion`: Show a code suggestion in an overlay at point.
- `exhub-fim-next-suggestion`: Cycle to the next suggestion.
- `exhub-fim-previous-suggestion`: Cycle to the previous suggestion.
- `exhub-fim-accept-suggestion`: Accept the current overlay suggestion.
- `exhub-fim-dismiss-suggestion`: Dismiss the current overlay suggestion.
- `exhub-fim-accept-suggestion-line`: Accept N lines of the current suggestion.
- `exhub-fim-complete-with-minibuffer`: Complete using the minibuffer interface.
- `exhub-fim-auto-suggestion-mode`: Toggle automatic code suggestions.
- `exhub-fim-configure-provider`: Configure an exhub-fim provider interactively.
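A typical setup might enable automatic suggestions in programming buffers and bind the cycling and accept commands. A sketch, where the keybindings are our own choices rather than package defaults:

```emacs-lisp
(require 'exhub-fim)
;; Enable automatic suggestions in programming buffers.
(add-hook 'prog-mode-hook #'exhub-fim-auto-suggestion-mode)
;; Illustrative keybindings (not package defaults).
(global-set-key (kbd "M-n") #'exhub-fim-next-suggestion)
(global-set-key (kbd "M-p") #'exhub-fim-previous-suggestion)
(global-set-key (kbd "C-<return>") #'exhub-fim-accept-suggestion)
```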
If you are running the Elixir proxy server locally (default port 9069), set the Gemini provider to use the proxy endpoint:
```emacs-lisp
(setq exhub-fim-provider 'gemini)
;; The default :end-point in exhub-fim-gemini-options is already
;; "http://localhost:9069/google/v1/models", so no further change is needed.
```

Ensure the environment variable `GEMINI_API_KEY` is exported in the shell that launches Emacs:

```shell
export GEMINI_API_KEY="your-gemini-key"
```

## Recent Improvements

- DRY Configuration Approach: Common API base and key values are now stored in variables for easier maintenance.
- Enhanced LLM Support: Added support for new models, including:
  - `qwen3-next-80b-a3b-instruct`
  - `deepseek-v3_1-terminus`
  - `deepseek-v3.2-exp`
  - `glm-4.6`
- Improved LLM Config Server: Enhanced validation, error handling, and type specifications
- Better State Management: Robust configuration state handling with fallback mechanisms
- Type Safety: Added proper type specifications for better code reliability
- Extended Model Support: Updated the router to support the new `qwen3-next-80b-a3b-instruct` model.
- Consistent API Mapping: Improved API base and key mapping for all supported models.
Feel free to contribute to Exhub by opening issues or pull requests on the GitHub repository.
Exhub is licensed under the MIT License.