
[LoRA] parse metadata from LoRA and save metadata #11324


Merged
merged 84 commits on Jun 13, 2025
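
Background (a sketch, not code from this PR): LoRA checkpoints saved as safetensors can carry a string-to-string metadata dict in the file header, which is the kind of per-file metadata a loader can parse back out and a saver can write. The `lora_metadata` key and the config values below are hypothetical placeholders, not necessarily what diffusers uses.

import json
import torch
from safetensors import safe_open
from safetensors.torch import save_file

# Toy adapter state dict; a real LoRA checkpoint holds the adapter weights here.
state_dict = {"unet.mid_block.lora_A.weight": torch.zeros(4, 1280)}

# safetensors metadata values must be strings, so structured config is JSON-encoded.
metadata = {"lora_metadata": json.dumps({"r": 4, "lora_alpha": 4})}  # placeholder key
save_file(state_dict, "adapter.safetensors", metadata=metadata)

# Parse it back from the header without loading the tensors.
with safe_open("adapter.safetensors", framework="pt") as f:
    adapter_config = json.loads(f.metadata()["lora_metadata"])
print(adapter_config)  # {'r': 4, 'lora_alpha': 4}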

Changes from 1 commit

Commits (84)
5139de1
feat: parse metadata from lora state dicts.
sayakpaul Apr 15, 2025
d8a305e
tests
sayakpaul Apr 15, 2025
ba546bc
fix tests
sayakpaul Apr 15, 2025
25f826e
Merge branch 'main' into metadata-lora
sayakpaul Apr 15, 2025
61d3708
key renaming
sayakpaul Apr 15, 2025
e98fb84
fix
sayakpaul Apr 15, 2025
2f1c326
Merge branch 'main' into metadata-lora
sayakpaul Apr 15, 2025
d390d4d
Merge branch 'main' into metadata-lora
sayakpaul Apr 16, 2025
201bd7b
resolve conflicts.
sayakpaul Apr 21, 2025
a771982
Merge branch 'main' into metadata-lora
sayakpaul May 2, 2025
42bb6bc
smol update
sayakpaul May 2, 2025
7ec4ef4
smol updates
sayakpaul May 2, 2025
7f59ca0
load metadata.
sayakpaul May 2, 2025
ded2fd6
automatically save metadata in save_lora_adapter.
sayakpaul May 2, 2025
d5b3037
propagate changes.
sayakpaul May 2, 2025
bee9e00
changes
sayakpaul May 2, 2025
a9f5088
add test to models too.
sayakpaul May 2, 2025
7716303
tighter tests.
sayakpaul May 2, 2025
0ac1a39
updates
sayakpaul May 2, 2025
4b51bbf
fixes
sayakpaul May 2, 2025
e2ca95a
rename tests.
sayakpaul May 2, 2025
7a2ba69
Merge branch 'main' into metadata-lora
sayakpaul May 3, 2025
e0449c2
sorted.
sayakpaul May 3, 2025
918aef1
Update src/diffusers/loaders/lora_base.py
sayakpaul May 3, 2025
4bd325c
review suggestions.
sayakpaul May 3, 2025
e8bec86
removeprefix.
sayakpaul May 5, 2025
aa5cb3c
Merge branch 'main' into metadata-lora
sayakpaul May 5, 2025
7bb6c9f
propagate changes.
sayakpaul May 8, 2025
116306e
fix-copies
sayakpaul May 8, 2025
ae0580a
sd
sayakpaul May 8, 2025
f6fde6f
docs.
sayakpaul May 8, 2025
cbb4071
resolve conflicts.
sayakpaul May 8, 2025
87417b2
fixes
sayakpaul May 8, 2025
55a41bf
Merge branch 'main' into metadata-lora
sayakpaul May 9, 2025
16dba2d
get review ready.
sayakpaul May 9, 2025
023c0fe
Merge branch 'main' into metadata-lora
sayakpaul May 9, 2025
67bceda
one more test to catch error.
sayakpaul May 9, 2025
83a8995
merge conflicts.
sayakpaul May 9, 2025
d336486
Merge branch 'main' into metadata-lora
sayakpaul May 11, 2025
4f2d90c
Merge branch 'main' into metadata-lora
sayakpaul May 12, 2025
42a0d1c
Merge branch 'main' into metadata-lora
sayakpaul May 15, 2025
9c32dc2
Merge branch 'main' into metadata-lora
linoytsaban May 18, 2025
5d578c9
Merge branch 'main' into metadata-lora
sayakpaul May 19, 2025
1c37845
Merge branch 'main' into metadata-lora
linoytsaban May 20, 2025
2bf7fde
Merge branch 'main' into metadata-lora
sayakpaul May 21, 2025
4304a6d
change to a different approach.
sayakpaul May 22, 2025
425ea95
fix-copies.
sayakpaul May 22, 2025
e08830e
todo
sayakpaul May 22, 2025
40f5c97
sd3
sayakpaul May 22, 2025
5a2a023
update
sayakpaul May 22, 2025
0ae3408
revert changes in get_peft_kwargs.
sayakpaul May 22, 2025
99fe09c
update
sayakpaul May 22, 2025
f4d4179
fixes
sayakpaul May 22, 2025
46f4726
fixes
sayakpaul May 22, 2025
c4bd1c7
Merge branch 'main' into metadata-lora
sayakpaul May 22, 2025
1348463
simplify _load_sft_state_dict_metadata
sayakpaul May 22, 2025
ef16bce
update
sayakpaul May 22, 2025
9cba78e
style fix
sayakpaul May 22, 2025
28d634f
update
sayakpaul May 22, 2025
e07ace0
update
sayakpaul May 22, 2025
c762b7c
update
sayakpaul May 22, 2025
c8c33d3
empty commit
sayakpaul May 22, 2025
aabfb5f
resolve conflicts.
sayakpaul May 22, 2025
a4f78c8
Merge branch 'main' into metadata-lora
sayakpaul May 27, 2025
e3e8b20
Merge branch 'main' into metadata-lora
sayakpaul Jun 2, 2025
72b489d
resolve conflicts.
sayakpaul Jun 4, 2025
d952267
_pack_dict_with_prefix
sayakpaul Jun 5, 2025
9bbc6dc
update
sayakpaul Jun 5, 2025
eb52469
TODO 1.
sayakpaul Jun 5, 2025
461d2bd
todo: 2.
sayakpaul Jun 5, 2025
f78c6f9
todo: 3.
sayakpaul Jun 5, 2025
0eba7e7
update
sayakpaul Jun 5, 2025
a4a15b5
update
sayakpaul Jun 5, 2025
4588d83
Merge branch 'main' into metadata-lora
sayakpaul Jun 5, 2025
47cad58
Merge branch 'main' into metadata-lora
sayakpaul Jun 6, 2025
252fd21
Apply suggestions from code review
sayakpaul Jun 6, 2025
29ff6f1
reraise.
sayakpaul Jun 6, 2025
0007969
Merge branch 'main' into metadata-lora
sayakpaul Jun 6, 2025
cbc01a3
Merge branch 'main' into metadata-lora
sayakpaul Jun 9, 2025
2cb9e46
Merge branch 'main' into metadata-lora
sayakpaul Jun 10, 2025
c0d5156
Merge branch 'main' into metadata-lora
sayakpaul Jun 12, 2025
603462e
Merge branch 'main' into metadata-lora
sayakpaul Jun 13, 2025
37a225a
move argument.
sayakpaul Jun 13, 2025
1c03709
Merge branch 'main' into metadata-lora
sayakpaul Jun 13, 2025
sd
sayakpaul committed May 8, 2025
commit ae0580a548cdc7d8c8eeed26cbc84041a05be501
12 changes: 12 additions & 0 deletions src/diffusers/loaders/lora_pipeline.py
@@ -457,6 +457,8 @@ def save_lora_weights(
         weight_name: str = None,
         save_function: Callable = None,
         safe_serialization: bool = True,
+        unet_lora_adapter_metadata=None,
+        text_encoder_lora_adapter_metadata=None,
     ):
         r"""
         Save the LoRA parameters corresponding to the UNet and text encoder.
@@ -479,8 +481,11 @@
                 `DIFFUSERS_SAVE_MODE`.
             safe_serialization (`bool`, *optional*, defaults to `True`):
                 Whether to save the model using `safetensors` or the traditional PyTorch way with `pickle`.
+            unet_lora_adapter_metadata: TODO
+            text_encoder_lora_adapter_metadata: TODO
         """
         state_dict = {}
+        lora_adapter_metadata = {}

         if not (unet_lora_layers or text_encoder_lora_layers):
             raise ValueError("You must pass at least one of `unet_lora_layers` and `text_encoder_lora_layers`.")
@@ -491,6 +496,12 @@
         if text_encoder_lora_layers:
             state_dict.update(cls.pack_weights(text_encoder_lora_layers, cls.text_encoder_name))

+        if unet_lora_adapter_metadata is not None:
+            lora_adapter_metadata.update(cls.pack_weights(unet_lora_adapter_metadata, cls.unet_name))
+
+        if text_encoder_lora_adapter_metadata:
+            lora_adapter_metadata.update(cls.pack_weights(text_encoder_lora_adapter_metadata, cls.text_encoder_name))
+
         # Save the model
         cls.write_lora_layers(
             state_dict=state_dict,
@@ -499,6 +510,7 @@
             weight_name=weight_name,
             save_function=save_function,
             safe_serialization=safe_serialization,
+            lora_adapter_metadata=lora_adapter_metadata,
         )

     def fuse_lora(
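
For context, here is a minimal usage sketch of the two arguments this commit adds. It is not taken from the PR: the pipeline class, dummy tensor keys, and metadata values are illustrative assumptions, and it presumes a diffusers build that already includes this change.

import torch
from diffusers import StableDiffusionPipeline

# In practice these come out of a LoRA training run; dummy tensors stand in here.
unet_lora_layers = {"mid_block.attentions.0.proj_in.lora_A.weight": torch.zeros(4, 1280)}
text_encoder_lora_layers = {"text_model.encoder.layers.0.mlp.fc1.lora_A.weight": torch.zeros(4, 768)}

# save_lora_weights is a classmethod, so no pipeline weights need to be loaded.
StableDiffusionPipeline.save_lora_weights(
    save_directory="my-lora",
    unet_lora_layers=unet_lora_layers,
    text_encoder_lora_layers=text_encoder_lora_layers,
    # New in this commit: per-component adapter metadata (values are placeholders).
    unet_lora_adapter_metadata={"r": 4, "lora_alpha": 4},
    text_encoder_lora_adapter_metadata={"r": 4, "lora_alpha": 4},
)

The metadata dicts are packed with the same `unet.` / `text_encoder.` prefixes as the weights (via `cls.pack_weights`) and forwarded to `write_lora_layers` as `lora_adapter_metadata`, so each entry stays associated with the component it configures.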