PPO error: AttributeError: 'Qwen2ForCausalLM' object has no attribute 'zero_gather_16bit_weights_on_model_save' #3937

Open
qq941134965 opened this issue Apr 19, 2025 · 3 comments


@qq941134965

Training script:
CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7 \
NPROC_PER_NODE=8 \
swift rlhf \
    --rlhf_type ppo \
    --model ${model_path} \
    --model_type qwen2_5 \
    --reward_model ${reward_model_path} \
    --train_type ${train_type} \
    --dataset ${dataset} \
    --torch_dtype bfloat16 \
    --num_train_epochs 2 \
    --per_device_train_batch_size 1 \
    --per_device_eval_batch_size 1 \
    --learning_rate 1e-6 \
    --gradient_accumulation_steps 2 \
    --eval_steps 100 \
    --save_steps 100 \
    --logging_steps 5 \
    --max_length 4096 \
    --output_dir ${output_dir} \
    --warmup_ratio 0.05 \
    --dataloader_num_workers 4 \
    --temperature 0.7 \
    --top_p 0.9 \
    --deepspeed zero3

With DeepSpeed ZeRO-2 I run out of GPU memory; switching to ZeRO-3 triggers this error.
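For context: `zero_gather_16bit_weights_on_model_save` is an attribute that exists only on a model wrapped into a DeepSpeed engine (it mirrors the `stage3_gather_16bit_weights_on_model_save` ZeRO-3 config flag), so the traceback suggests the trainer's save path reads it from a raw `Qwen2ForCausalLM` that was never passed through `deepspeed.initialize`. A minimal sketch of the defensive check involved (all names here are hypothetical stand-ins, not ms-swift's actual code; the real fix belongs in the library):

```python
class PlainModel:
    """Stands in for a raw Qwen2ForCausalLM (no DeepSpeed wrapping)."""
    pass


class ZeroEngine:
    """Stands in for a DeepSpeed engine with ZeRO-3 weight gathering on."""
    zero_gather_16bit_weights_on_model_save = True


def can_consolidate_16bit(model) -> bool:
    # Guard the attribute access that crashes in the traceback above:
    # only a DeepSpeed engine carries this flag, so default to False
    # instead of letting AttributeError propagate.
    return bool(getattr(model, "zero_gather_16bit_weights_on_model_save", False))


print(can_consolidate_16bit(PlainModel()))  # False — raw model, as in this issue
print(can_consolidate_16bit(ZeroEngine()))  # True — properly wrapped engine
```

With `getattr(..., False)` the unwrapped case degrades to "skip the 16-bit consolidation" rather than an `AttributeError`, which is why the error only appears once ZeRO-3 is selected and the save path takes the gather branch.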

@Jintao-Huang
Collaborator

What swift version are you using?

@qq941134965
Author

The ms-swift version is 3.2.2.

@qq941134965
Author

> What swift version are you using?

The ms-swift version is 3.2.2.
