generation_config default values have been modified to match model-specific defaults: {'temperature': 0.1}. If this is not desired, please set these values explicitly. #3892

Open
xyxxmb opened this issue Apr 16, 2025 · 1 comment

Comments

xyxxmb commented Apr 16, 2025

Step 1: Starting from qwen-vl-7b, I trained a LoRA adapter with SFT, then used the merge script to merge the LoRA into the base model, producing model A.
Step 2: Using the swift infer script, I varied the temperature; the outputs for temperature=0 and temperature=1 differed, as expected.
Step 3: Using the merged model A, I trained a new LoRA adapter with GRPO, then used the merge script to merge the new LoRA into model A, producing model B.
Step 4: Using the swift infer script on model B, I varied the temperature again; the outputs for temperature=0 and temperature=1 were now completely identical, and inference printed the following info message:
generation_config default values have been modified to match model-specific defaults: {'temperature': 0.1}. If this is not desired, please set these values explicitly.
My understanding is that the temperature=0 and temperature=1 settings may not be taking effect, which would explain the identical outputs. What causes this, and why is this info message printed?
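The symptom is consistent with the model's own generation_config defaults being applied after the user's CLI value, so the explicit temperature is silently discarded. A minimal illustrative sketch of that precedence problem (plain dicts only; none of these function names come from ms-swift or transformers internals):

```python
# Illustrative sketch: how re-applying model defaults after user
# overrides can silently discard a CLI temperature setting.

def merged_generation_config(user_args, model_defaults):
    """Correct precedence: explicit user values win over model defaults."""
    config = dict(model_defaults)   # start from the model's defaults...
    config.update(user_args)        # ...then let explicit user args override
    return config

def buggy_generation_config(user_args, model_defaults):
    """Buggy precedence: model defaults are applied last and clobber the user."""
    config = dict(user_args)
    config.update(model_defaults)   # defaults override the user's choice
    return config

model_defaults = {"temperature": 0.1}

# Correct precedence: temperature=0 and temperature=1 stay distinct.
print(merged_generation_config({"temperature": 0.0}, model_defaults))
print(merged_generation_config({"temperature": 1.0}, model_defaults))

# Buggy precedence: both collapse to the model default of 0.1,
# so greedy and sampled runs produce identical outputs.
print(buggy_generation_config({"temperature": 0.0}, model_defaults))
print(buggy_generation_config({"temperature": 1.0}, model_defaults))
```

With the buggy ordering, every run effectively samples at temperature 0.1, matching the observed identical outputs and the info message about model-specific defaults.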

@Jintao-Huang (Collaborator)

You can resolve this issue by downgrading transformers to version 4.49 or by upgrading ms-swift.
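For reference, the two suggested fixes would be applied along these lines (pip shown as an assumption; any 4.49.x patch release should match the suggested version):

```shell
# Option 1: pin transformers to the suggested 4.49 series
pip install "transformers==4.49.*"

# Option 2: upgrade ms-swift to a release that accounts for the
# newer transformers generation_config default handling
pip install -U ms-swift
```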
