base: master
fix(openai): add support for reasoning content in OpenAI chat responses from OpenRouter #32982
Conversation
Add `reasoning_content` to `additional_kwargs` when the `reasoning` field is present in the response message. This enables support for reasoning content from providers like OpenRouter that include this field in their responses.
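For illustration, here is a minimal sketch of that mapping, assuming the Chat Completions response message is available as a plain dict; the helper name is hypothetical and not the actual langchain-openai internal function.

```python
# Hypothetical helper illustrating the change: surface OpenRouter's "reasoning"
# field to callers as additional_kwargs["reasoning_content"].
def convert_response_message(response_message: dict) -> dict:
    additional_kwargs = {}
    reasoning = response_message.get("reasoning")
    if reasoning:
        additional_kwargs["reasoning_content"] = reasoning
    return {
        "content": response_message.get("content") or "",
        "additional_kwargs": additional_kwargs,
    }
```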
CodSpeed Instrumentation Performance Report: Merging #32982 will not alter performance.
Hi @sinanuozdemir, thanks for this.
You should be able to resolve your issue immediately by using `ChatDeepSeek` (I see from the originating issue you have `langchain-deepseek` already installed). That model basically subclasses `BaseChatOpenAI` and extends it with this feature. Let me know if that doesn't work for you.
There is a lot of confusion around this, and it's arguably weird to force use of ChatDeepSeek for some chat completions use cases, so I'm not opposed to going with this PR. If you want to add unit tests and extend to the streaming case, I think we could push it through.
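For anyone landing here, a minimal sketch of the workaround suggested above: point `ChatDeepSeek` at the OpenRouter endpoint. The model id and key below are placeholders, and whether a given OpenRouter model populates the reasoning field depends on the upstream provider.

```python
from langchain_deepseek import ChatDeepSeek

# Point ChatDeepSeek at the OpenRouter endpoint instead of the DeepSeek API.
llm = ChatDeepSeek(
    model="deepseek/deepseek-r1",              # placeholder OpenRouter model id
    api_base="https://openrouter.ai/api/v1",
    api_key="YOUR_OPENROUTER_API_KEY",
)

msg = llm.invoke("What is 17 * 24?")
print(msg.additional_kwargs.get("reasoning_content"))
```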
Came across this PR while researching why using the … output contains `reasoning_content`.
@sinanuozdemir are you going to continue working on this PR? Would be great to make it functional for …
@ccurme I think I'm ready for another review!
I came across this PR while running into the same issue. I had been using the OpenAI SDK directly for Claude via OpenRouter, since I couldn’t find documentation that ChatDeepSeek could be used in that context.
@ccurme I agree with your point about developer confusion: requiring `ChatDeepSeek` for some chat completion use cases isn't intuitive, especially since OpenRouter's docs point to `ChatOpenAI` for the LangChain integration. What do you think about inheriting from `BaseChatOpenAI` and creating a dedicated `ChatOpenRouter` wrapper? That might make the integration cleaner and reduce confusion. I'd be happy to contribute code for this (in TS and possibly Python) if that direction aligns with the project.
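To make the idea concrete, a hypothetical sketch of such a wrapper; `ChatOpenRouter` does not exist in langchain-openai today, and the defaults shown here are assumptions.

```python
from langchain_openai.chat_models.base import BaseChatOpenAI

class ChatOpenRouter(BaseChatOpenAI):
    """Hypothetical thin wrapper that points BaseChatOpenAI at OpenRouter."""

    def __init__(self, **kwargs):
        # Default to the OpenRouter endpoint unless the caller overrides it.
        kwargs.setdefault("base_url", "https://openrouter.ai/api/v1")
        super().__init__(**kwargs)

# Usage (model id and key are placeholders):
llm = ChatOpenRouter(
    model="anthropic/claude-3.5-sonnet",
    api_key="YOUR_OPENROUTER_API_KEY",
)
```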
Description: I added `reasoning_content` to `additional_kwargs` when the `reasoning` field is present in the response message. This enables support for reasoning content from providers like OpenRouter that include this field in their responses.
Issue: #32981
Dependencies: N/A
P.S. this is my first time ever attempting to contribute to an open source package so feel free to rip me to shreds.
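On the streaming case mentioned in the review above: once it is handled, reasoning deltas would presumably surface on each chunk's `additional_kwargs`. A hedged usage sketch, reusing the ChatDeepSeek-over-OpenRouter setup from the workaround above with placeholder credentials:

```python
from langchain_deepseek import ChatDeepSeek

llm = ChatDeepSeek(
    model="deepseek/deepseek-r1",              # placeholder OpenRouter model id
    api_base="https://openrouter.ai/api/v1",
    api_key="YOUR_OPENROUTER_API_KEY",
)

# Collect the reasoning text as it streams in, separately from the answer.
reasoning_parts = []
for chunk in llm.stream("What is 17 * 24?"):
    part = chunk.additional_kwargs.get("reasoning_content")
    if part:
        reasoning_parts.append(part)
print("".join(reasoning_parts))
```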