Description
Self Checks
- I have searched for existing issues, including closed ones.
- I confirm that I am using English to submit this report (Language Policy).
- Non-English title submissions will be closed directly (Language Policy).
- Please do not modify this template :) and fill in all the required fields.
RAGFlow workspace code commit ID
RAGFlow image version
0.18
Other environment information
Actual behavior
When I update the model selection through the "Update chat assistant" endpoint (PUT /api/v1/chats/{chat_id}) and pass llm.model_name=deepseek-r1:32b, the request is saved successfully. However, the assistant session then cannot call the model: the code shown above does not append the @Ollama factory suffix to my model name, so llm_id is stored as deepseek-r1:32b. If I instead send llm.model_name=deepseek-r1:32b@Ollama so that llm_id is saved correctly, the value fails the validation in the source code shown in the screenshot above.
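A minimal sketch of why the stored value breaks the session, assuming the llm_id is split on "@" into a model name and a factory (the function name here is hypothetical, not RAGFlow's actual code):

```python
def split_llm_id(llm_id: str):
    """Split a stored llm_id into (model_name, factory).

    Hypothetical illustration: without the "@Ollama" suffix there is
    no factory part, so the session cannot tell which provider to call.
    """
    name, _, factory = llm_id.partition("@")
    return name, (factory or None)


print(split_llm_id("deepseek-r1:32b"))         # -> ('deepseek-r1:32b', None)
print(split_llm_id("deepseek-r1:32b@Ollama"))  # -> ('deepseek-r1:32b', 'Ollama')
```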
Expected behavior
The llm.model_name=deepseek-r1:32b I submit should have the @Ollama suffix appended automatically, or the validation query should truncate the @ suffix before looking up the model.
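Either fix could look roughly like the following sketch. Both helper names and the default factory value are assumptions for illustration, not RAGFlow's actual API:

```python
def normalize_llm_id(model_name: str, default_factory: str = "Ollama") -> str:
    """Append "@<factory>" when the caller omits it (first proposed fix)."""
    return model_name if "@" in model_name else f"{model_name}@{default_factory}"


def model_name_for_lookup(llm_id: str) -> str:
    """Strip the "@<factory>" suffix before validating against the
    bare model name (second proposed fix)."""
    return llm_id.split("@", 1)[0]


print(normalize_llm_id("deepseek-r1:32b"))             # -> deepseek-r1:32b@Ollama
print(model_name_for_lookup("deepseek-r1:32b@Ollama"))  # -> deepseek-r1:32b
```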
Steps to reproduce
Update the chat assistant through the "Update chat assistant" endpoint by sending a PUT request to /api/v1/chats/{chat_id} with a new value for llm.model_name.
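The step above can be reproduced with a request like this sketch (the host, API key, and chat id are placeholders; the commented-out `requests` call requires a running RAGFlow server):

```python
import json

# Placeholder values; substitute your RAGFlow host, API key, and chat id.
BASE_URL = "http://localhost:9380"
API_KEY = "YOUR_API_KEY"
CHAT_ID = "YOUR_CHAT_ID"

# Payload that triggers the bug: model_name without the "@Ollama" suffix
# is saved verbatim as llm_id, so the session later fails to call the model.
payload = {"llm": {"model_name": "deepseek-r1:32b"}}

print(json.dumps(payload))

# To actually send it (requires the `requests` package):
# import requests
# resp = requests.put(
#     f"{BASE_URL}/api/v1/chats/{CHAT_ID}",
#     headers={"Authorization": f"Bearer {API_KEY}"},
#     json=payload,
# )
# print(resp.status_code, resp.json())
```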
Additional information
No response