[feature] configurable llm #20


Draft: wants to merge 13 commits into main

Conversation

@AlbertDeFusco (Collaborator) commented Jan 29, 2025

The LLM must be a subclass of langchain's BaseChatModel. Only langchain-core is a required dependency; you will need to install the required langchain integration packages separately.

To change the LLM, edit the ~/.anaconda/config.toml file:

```toml
[plugin.assistant_conda.llm]
driver = "langchain_ollama.chat_models:ChatOllama"
params = { model = "llama2", temperature = 0.1 }
combine_messages = true
```

For the llama2 model above you will need to 1) have Ollama installed with `ollama serve` running, and 2) install `langchain-ollama`.
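The `driver` value follows the familiar `module.path:ClassName` entry-point convention. A minimal sketch of how such a string could be resolved (a hypothetical `load_driver` helper, not necessarily the plugin's actual code):

```python
import importlib


def load_driver(spec: str):
    """Resolve a 'module.path:ClassName' string to the class it names."""
    module_path, _, class_name = spec.partition(":")
    module = importlib.import_module(module_path)
    return getattr(module, class_name)


# Hypothetical usage with the config values from above
# (requires langchain-ollama to be installed):
# ChatOllama = load_driver("langchain_ollama.chat_models:ChatOllama")
# llm = ChatOllama(model="llama2", temperature=0.1)
```

Instantiating the class with the `params` table as keyword arguments would then yield the configured chat model.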

Some models do not work well with separate system and user messages. `combine_messages` defaults to `false`; if you encounter issues with the response, try setting it to `true`.
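One plausible way a `combine_messages` option could behave is sketched below, illustrative only, using simple `(role, content)` tuples rather than langchain's message classes:

```python
def combine_messages(messages):
    """Merge a list of (role, content) pairs into a single user message.

    Some chat models reject or mishandle a separate system role, so the
    system prompt is folded into the user content instead.
    """
    combined = "\n\n".join(content for _role, content in messages)
    return [("user", combined)]


msgs = [
    ("system", "You are a conda expert."),
    ("user", "How do I create an environment?"),
]
print(combine_messages(msgs))
```

With the flag enabled, the model sees one user turn containing both the instructions and the question, which sidesteps system-role quirks at the cost of a less structured prompt.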

@AlbertDeFusco added the enhancement (New feature or request) and conda (Issues related to anaconda-assistant-conda) labels Jan 29, 2025
@AlbertDeFusco AlbertDeFusco marked this pull request as draft February 12, 2025 19:07