
configuring defaultOptions can cause defaultToolCallbacks failure #3392

Open
jimmyqin opened this issue May 30, 2025 · 3 comments

Comments

@jimmyqin

Bug description
When defaultOptions is configured on the ChatClient builder, the defaultToolCallbacks stop working: chat calls to the AI never trigger tool calls. After deleting the defaultOptions configuration, tool calls are triggered again when conversing with the AI.

Environment
Spring AI 1.0.0, Spring Boot 3.5.0, JDK 21

Steps to reproduce

# ai
spring.ai.deepseek.api-key=xxxxxxxxxxx

package com.demo.ai.config;

import lombok.RequiredArgsConstructor;
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.chat.client.advisor.MessageChatMemoryAdvisor;
import org.springframework.ai.chat.client.advisor.SimpleLoggerAdvisor;
import org.springframework.ai.chat.memory.ChatMemoryRepository;
import org.springframework.ai.chat.memory.MessageWindowChatMemory;
import org.springframework.ai.chat.prompt.ChatOptions;
import org.springframework.ai.deepseek.api.DeepSeekApi;
import org.springframework.ai.tool.ToolCallbackProvider;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.Resource;

@RequiredArgsConstructor
@Configuration
public class AiConfig {
    @Value("classpath:/prompts/system-message.st")
    private Resource systemResource;
    private final ChatMemoryRepository chatMemoryRepository;

    @Bean("chatClient")
    public ChatClient chatClient(ChatClient.Builder chatClientBuilder, ToolCallbackProvider[] toolCallbackProviders) {
        MessageWindowChatMemory chatMemory = MessageWindowChatMemory.builder()
                .chatMemoryRepository(chatMemoryRepository)
                .maxMessages(50)
                .build();
        return chatClientBuilder
                .defaultOptions(ChatOptions.builder()
                        .model(DeepSeekApi.ChatModel.DEEPSEEK_CHAT.getValue())
                        .temperature(0.7)
                        .build())
                .defaultAdvisors(MessageChatMemoryAdvisor.builder(chatMemory).build(), new SimpleLoggerAdvisor())
                .defaultSystem(systemResource)
                .defaultToolCallbacks(toolCallbackProviders)
                .build();
    }

    @Bean("thinkChatClient")
    public ChatClient thinkChatClient(ChatClient.Builder chatClientBuilder, ToolCallbackProvider[] toolCallbackProviders) {
        MessageWindowChatMemory chatMemory = MessageWindowChatMemory.builder()
                .chatMemoryRepository(chatMemoryRepository)
                .maxMessages(50)
                .build();
        return chatClientBuilder
                .defaultOptions(ChatOptions.builder()
                        .model(DeepSeekApi.ChatModel.DEEPSEEK_REASONER.getValue())
                        .temperature(0.6)
                        .build())
                .defaultAdvisors(MessageChatMemoryAdvisor.builder(chatMemory).build(), new SimpleLoggerAdvisor())
                .defaultSystem(systemResource)
                .defaultToolCallbacks(toolCallbackProviders)
                .build();
    }
}

package com.demo.ai.service;

import com.demo.base.excpetion.ServiceException;
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.chat.memory.ChatMemory;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.stereotype.Service;
import org.springframework.web.servlet.mvc.method.annotation.SseEmitter;

import java.io.IOException;

@Service
public class ChatService {
    private final ChatClient chatClient;
    private final ChatClient thinkChatClient;

    public ChatService(@Qualifier("chatClient") ChatClient chatClient,
                       @Qualifier("thinkChatClient") ChatClient thinkChatClient) {
        this.chatClient = chatClient;
        this.thinkChatClient = thinkChatClient;
    }

    public SseEmitter chat(String question, String conversationId, Boolean isThinking) {
        ChatClient client = isThinking ? thinkChatClient : chatClient;
        return chat(question, conversationId, client);
    }

    private SseEmitter chat(String question, String conversationId, ChatClient client) {
        SseEmitter emitter = new SseEmitter();
        client.prompt()
                .user(question)
                .advisors(a -> a.param(ChatMemory.CONVERSATION_ID, conversationId))
                .stream()
                .chatResponse()
                .doOnNext(message -> {
                    try {
                        emitter.send(message.getResult());
                    } catch (IOException e) {
                        emitter.completeWithError(e);
                        throw new ServiceException("send error");
                    }
                })
                .doOnComplete(emitter::complete)
                .doOnError(emitter::completeWithError)
                .subscribe();
        return emitter;
    }

}

Expected behavior
I want to configure two models and switch between the two ChatClient beans depending on a flag sent from the front end.

Minimal Complete Reproducible example
Please provide a failing test or a minimal complete verifiable example that reproduces the issue.
Bug reports that are reproducible will take priority in resolution over reports that are not reproducible.

@markpollack
Member

It is not clear to me what is failing from the code you provided. Can you please clarify?

@jimmyqin
Author

@markpollack If I configure the default options, like this:

                .defaultOptions(ChatOptions.builder()
                        .model(DeepSeekApi.ChatModel.DEEPSEEK_CHAT.getValue())
                        .temperature(0.7)
                        .build())

my tools do not work. If I delete the default options, the tools work (a sketch of that working variant follows the configuration below).

import org.springframework.ai.tool.annotation.Tool;
import org.springframework.ai.tool.annotation.ToolParam;
import org.springframework.stereotype.Service;

@Service
public class ToolService {

    @Tool(description = "query weather")
    public String queryWeather(@ToolParam(description = "Weather date") String date) {
        System.out.println("weather date is: " + date);
        return date + " The weather is sunny";
    }
}

import org.springframework.ai.tool.ToolCallbackProvider;
import org.springframework.ai.tool.method.MethodToolCallbackProvider;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class ToolConfig {

    @Bean
    public ToolCallbackProvider toolServiceMethodToolCallback(ToolService toolService) {
        return MethodToolCallbackProvider.builder().toolObjects(toolService).build();
    }
}
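
For contrast, here is a minimal sketch of the working variant described above, with the defaultOptions call removed; the model and temperature would then have to come from the DeepSeek auto-configuration defaults (for example a spring.ai.deepseek.chat.options.model property, which is an assumption here), and the rest of the bean is unchanged from the original report:

// Sketch only: same chatClient bean as in the report, but without defaultOptions.
@Bean("chatClient")
public ChatClient chatClient(ChatClient.Builder chatClientBuilder, ToolCallbackProvider[] toolCallbackProviders) {
    MessageWindowChatMemory chatMemory = MessageWindowChatMemory.builder()
            .chatMemoryRepository(chatMemoryRepository)
            .maxMessages(50)
            .build();
    return chatClientBuilder
            // no defaultOptions here, so the registered tool callbacks reach the model as reported
            .defaultAdvisors(MessageChatMemoryAdvisor.builder(chatMemory).build(), new SimpleLoggerAdvisor())
            .defaultSystem(systemResource)
            .defaultToolCallbacks(toolCallbackProviders)
            .build();
}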

@lambochen
Contributor

lambochen commented May 31, 2025

@jimmyqin @markpollack

Hi, I've encountered the same issue; it might be the same root cause, can you confirm?
This issue occurs because ChatOptions.builder() creates a DefaultChatOptions, which does not support tool calling, so when the request options are passed to the LLM the tools list is discarded.

Solution: use a ChatOptions implementation that supports tool calling, for example:

ToolCallingChatOptions.builder().model(model).build()
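
Applied to the chatClient bean from the report, a hedged sketch of the suggested fix (assuming ToolCallingChatOptions from org.springframework.ai.model.tool; only the defaultOptions call changes):

import org.springframework.ai.model.tool.ToolCallingChatOptions;

@Bean("chatClient")
public ChatClient chatClient(ChatClient.Builder chatClientBuilder, ToolCallbackProvider[] toolCallbackProviders) {
    MessageWindowChatMemory chatMemory = MessageWindowChatMemory.builder()
            .chatMemoryRepository(chatMemoryRepository)
            .maxMessages(50)
            .build();
    return chatClientBuilder
            // tool-aware options: model/temperature defaults are kept and the tool callbacks are not dropped
            .defaultOptions(ToolCallingChatOptions.builder()
                    .model(DeepSeekApi.ChatModel.DEEPSEEK_CHAT.getValue())
                    .temperature(0.7)
                    .build())
            .defaultAdvisors(MessageChatMemoryAdvisor.builder(chatMemory).build(), new SimpleLoggerAdvisor())
            .defaultSystem(systemResource)
            .defaultToolCallbacks(toolCallbackProviders)
            .build();
}

A provider-specific options class (for example DeepSeekChatOptions, if it implements ToolCallingChatOptions) should work here as well.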
