Sd3 safetensors merge #1960
Conversation
Fixed a bug that caused tensors to be converted to float32; they are now saved as-is.

@FurkanGozukara Thanks for reporting it. I will merge this PR if you confirm the fix.
Just tested, and it works great: FP16 at half the size :) Please merge as soon as possible.
@kohya-ss Can you merge this today, so I can start new trainings with this feature and auto-save with CLIP-L and CLIP-G while training?
Thank you for testing, and sorry for the delay!
Thank you. So what parameter do I need to pass to training to merge automatically during training? Or will you add that separately?
There is no automatic merging, so please run the script separately. |
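The separate merge step described above essentially combines the component state dicts into one single-file checkpoint under per-component key prefixes. A minimal sketch of that idea follows; the function name is hypothetical, and the prefixes (`model.diffusion_model.`, `text_encoders.clip_l.`, `text_encoders.clip_g.`) are assumptions based on the common SD3 single-file layout, not taken from this PR:

```python
def merge_sd3_state_dicts(mmdit, clip_l, clip_g):
    """Combine component state dicts into one single-file checkpoint.

    The key prefixes below are an assumption modeled on the common
    SD3 single-file layout; check the actual merge script for the
    exact names. Tensors are copied by reference with no dtype
    conversion, matching the "save as-is" fix in this PR.
    """
    merged = {}
    for prefix, state_dict in (
        ("model.diffusion_model.", mmdit),
        ("text_encoders.clip_l.", clip_l),
        ("text_encoders.clip_g.", clip_g),
    ):
        for key, tensor in state_dict.items():
            merged[prefix + key] = tensor
    return merged
```

Saving `merged` with `safetensors.torch.save_file` would then produce the single checkpoint; since no `.float()` cast is applied, FP16 inputs stay FP16 on disk.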
Can you please implement automatic merging? Running this script on each file is extremely tiresome and impractical for many people.
@kohya-ss Which file would I need to modify to achieve this? Which file saves the checkpoint, CLIP-L, and CLIP-G during training, so that I can modify it to generate the merged output and delete the other outputs?
Currently all models are loaded into main RAM before saving, so the merge uses main RAM equal to the total size of the .safetensors files.
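Because every input is held fully in memory before the merged file is written, a rough lower bound on peak RAM is simply the sum of the input file sizes (tensor data dominates a .safetensors file; the header is negligible). A minimal sketch, with a hypothetical helper name:

```python
import os

def estimated_merge_ram_bytes(checkpoint_paths):
    # Each input .safetensors file is loaded fully into main RAM
    # before the merged output is saved, so peak usage is at least
    # the combined size of the inputs.
    return sum(os.path.getsize(p) for p in checkpoint_paths)
```

For example, merging a checkpoint plus CLIP-L and CLIP-G files totaling ~5 GB would need roughly that much free main RAM on top of whatever else is running.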