
Add support for max_completion_tokens in the AzureOpenAI chat options request and update the tests and documentation. #3305


Open
wants to merge 2 commits into main from azure-openai-chatoptions-maxCompletionToken

Conversation

iAMSagar44
Contributor

The maxTokens option is now deprecated in favor of max_completion_tokens, and it is not compatible with the o1 series models.

Reference - https://learn.microsoft.com/en-us/java/api/com.azure.ai.openai.models.chatcompletionsoptions?view=azure-java-preview#com-azure-ai-openai-models-chatcompletionsoptions-setmaxtokens(java-lang-integer)
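For illustration, a minimal sketch of how an application might set the new option, assuming the pull request adds a maxCompletionTokens(Integer) builder method to AzureOpenAiChatOptions (the method name is inferred from the PR title and is not confirmed against the final code):

```java
import org.springframework.ai.azure.openai.AzureOpenAiChatOptions;

class MaxCompletionTokensExample {

    // Caps generated tokens (visible output plus reasoning tokens) via the new
    // field instead of the deprecated maxTokens. The maxCompletionTokens(...)
    // builder method is an assumption based on the PR title.
    static AzureOpenAiChatOptions buildOptions() {
        return AzureOpenAiChatOptions.builder()
                .maxCompletionTokens(1024)
                .build();
    }
}
```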

Add support for max_completion_tokens in AzureOpenAI chat options request

  An upper bound for the number of tokens that can be generated for a completion,
  including visible output tokens and reasoning tokens.
  Replaces max_tokens field which is now deprecated.

Signed-off-by: Sagar <sagar.jobs@outlook.com>
Update content to add spring.ai.openai.chat.options.maxCompletionTokens property

Signed-off-by: Sagar <sagar.jobs@outlook.com>
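For reference, the documented property could then be set in application.properties roughly as below; the key is copied from the commit message above and may read differently in the merged documentation (Azure-specific options are usually documented under a spring.ai.azure.openai prefix):

```properties
# Upper bound on tokens generated per completion, including reasoning tokens.
# Key copied from the commit message above; verify against the merged docs.
spring.ai.openai.chat.options.maxCompletionTokens=1024
```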
@iAMSagar44 force-pushed the azure-openai-chatoptions-maxCompletionToken branch from a16565f to 87ded10 on May 24, 2025 02:38
@markpollack
Member

Hi, thanks for the PR. We can't introduce a breaking change in our public API (e.g. by removing getMaxTokens() and similar accessors). I think this should be handled in a more subtle way, keeping the current API but under the covers passing the right value into the Azure SDK.
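For illustration, a minimal sketch of the non-breaking approach suggested here: keep the existing getMaxTokens() accessor on the public options and map its value onto the Azure SDK's newer field when the request is built. It assumes the Azure SDK version on the classpath exposes ChatCompletionsOptions.setMaxCompletionTokens, as the deprecation notice linked above implies; the class and method names other than ChatCompletionsOptions are illustrative, not the project's actual code.

```java
import java.util.List;

import com.azure.ai.openai.models.ChatCompletionsOptions;
import com.azure.ai.openai.models.ChatRequestMessage;

final class AzureRequestMapping {

    // Forwards the value of the existing public maxTokens option to the
    // non-deprecated Azure SDK field instead of calling the deprecated
    // setMaxTokens(Integer).
    static ChatCompletionsOptions toAzureOptions(List<ChatRequestMessage> messages,
            Integer maxTokensFromPublicOptions) {
        ChatCompletionsOptions azureOptions = new ChatCompletionsOptions(messages);
        if (maxTokensFromPublicOptions != null) {
            azureOptions.setMaxCompletionTokens(maxTokensFromPublicOptions);
        }
        return azureOptions;
    }
}
```

With this shape, existing callers of getMaxTokens()/setMaxTokens() keep working while o1-series deployments receive the field they require.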
