The original GTE 7B model is roughly 29 GB on disk. Using the training script from GitHub with the corresponding training arguments from the examples, but switched to full-parameter training, the saved weights come out at 14 GB, and loading the GTE model after full-parameter training fails with an error #4005


Open
zhanlun150729 opened this issue Apr 27, 2025 · 1 comment

Comments


zhanlun150729 commented Apr 27, 2025

(Screenshots attached in the original issue showing the training configuration and the load-time error.)
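A note on the size difference: 7B parameters stored in fp32 take roughly 7e9 × 4 bytes ≈ 28 GB, while the same weights in bf16/fp16 take about 14 GB, so one possible explanation is that the checkpoint was written in half precision rather than weights actually being lost. The sketch below is one way to check this directly; the checkpoint path is a placeholder, and it assumes the output is saved as safetensors shards.

```python
# Minimal sketch: list the dtypes and total size of a saved checkpoint.
# "output/checkpoint-xxx" is a placeholder; substitute the real output directory.
import os
from safetensors import safe_open

ckpt_dir = "output/checkpoint-xxx"  # hypothetical path to the trained checkpoint
total_bytes = 0
dtypes = set()
for name in sorted(os.listdir(ckpt_dir)):
    if not name.endswith(".safetensors"):
        continue
    with safe_open(os.path.join(ckpt_dir, name), framework="pt") as f:
        for key in f.keys():
            t = f.get_tensor(key)
            dtypes.add(str(t.dtype))
            total_bytes += t.numel() * t.element_size()

print("tensor dtypes:", dtypes)
print(f"total parameter size: {total_bytes / 1024**3:.1f} GiB")
```

If the dtypes come back as torch.bfloat16 and the total size is near 14 GiB with the full tensor count, the shrinkage is just precision, not missing weights.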

MengZizheng commented May 8, 2025

I ran into the same problem; the saved output does not contain the full set of parameters.
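If the saved output really is missing parameters, one quick way to confirm which tensors are absent is to diff the tensor names in the original model directory against the trained checkpoint. A minimal sketch, assuming both directories hold safetensors files; both paths are placeholders:

```python
# Compare tensor names between the original model and the trained checkpoint.
import glob
import os
from safetensors import safe_open

def tensor_names(model_dir):
    """Collect all tensor names across the safetensors shards in a directory."""
    names = set()
    for path in glob.glob(os.path.join(model_dir, "*.safetensors")):
        with safe_open(path, framework="pt") as f:
            names.update(f.keys())
    return names

original = tensor_names("gte-Qwen2-7B-instruct")  # hypothetical original model dir
trained = tensor_names("output/checkpoint-xxx")   # hypothetical trained checkpoint dir

missing = sorted(original - trained)
print(f"{len(missing)} tensors present in the original model but not in the checkpoint:")
for name in missing[:20]:
    print(" ", name)
```

Posting the list of missing tensor names (if any) alongside the training arguments would make it easier to pin down whether this is a save-time or a load-time problem.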
