[Feat] Support SDXL Kohya-style LoRA #4287

Merged: 99 commits, Jul 28, 2023
The diff below shows the changes from 1 commit.

Commits (99)
9ae25cf
sdxl lora changes.
sayakpaul Jul 26, 2023
37f5598
better name replacement.
sayakpaul Jul 26, 2023
be43f12
better replacement.
sayakpaul Jul 26, 2023
431a7ef
debugging
sayakpaul Jul 26, 2023
177c4fe
debugging
sayakpaul Jul 26, 2023
d4e1c22
debugging
sayakpaul Jul 26, 2023
ed74dfe
debugging
sayakpaul Jul 26, 2023
3667888
debugging
sayakpaul Jul 26, 2023
40cb790
remove print.
sayakpaul Jul 26, 2023
cbc21f7
print state dict keys.
sayakpaul Jul 26, 2023
e12b128
print
sayakpaul Jul 26, 2023
4ea13cf
distingisuih better
sayakpaul Jul 26, 2023
e0a946b
debuggable.
sayakpaul Jul 26, 2023
508bd78
fxi: tyests
sayakpaul Jul 26, 2023
b6f0ba0
fix: arg from training script.
sayakpaul Jul 26, 2023
ca35339
access from class.
sayakpaul Jul 26, 2023
6b535bc
run style
sayakpaul Jul 26, 2023
3e72de3
debug
patrickvonplaten Jul 26, 2023
0ce0544
save intermediate
patrickvonplaten Jul 26, 2023
456eba5
some simplifications for SDXL LoRA
patrickvonplaten Jul 26, 2023
d1efaac
styling
sayakpaul Jul 27, 2023
618aa9e
unet config is not needed in diffusers format.
sayakpaul Jul 27, 2023
46f2809
fix: dynamic SGM block mapping for SDXL kohya loras (#4322)
isidentical Jul 27, 2023
a18cb4d
Use lora compatible layers for linear proj_in/proj_out (#4323)
isidentical Jul 27, 2023
673a674
improve condition for using the sgm_diffusers mapping
sayakpaul Jul 27, 2023
eb4f8ce
informative comment.
sayakpaul Jul 27, 2023
205bb73
load compatible keys and embedding layer maaping.
sayakpaul Jul 27, 2023
16632cd
Get SDXL 1.0 example lora to load
patrickvonplaten Jul 28, 2023
24cf8bd
simplify
sayakpaul Jul 28, 2023
935839d
specif ranks and hidden sizes.
sayakpaul Jul 28, 2023
cd581c2
better handling of k rank and hidden
sayakpaul Jul 28, 2023
d7c9be1
debug
sayakpaul Jul 28, 2023
b614d6e
debug
sayakpaul Jul 28, 2023
0b8c47f
debug
sayakpaul Jul 28, 2023
190dfad
debug
sayakpaul Jul 28, 2023
925327c
debug
sayakpaul Jul 28, 2023
37215a9
fix: alpha keys
sayakpaul Jul 28, 2023
e034ff9
add check for handling LoRAAttnAddedKVProcessor
sayakpaul Jul 28, 2023
e424968
sanity comment
sayakpaul Jul 28, 2023
4d1295f
modifications for text encoder SDXL
sayakpaul Jul 28, 2023
619e01c
debugging
sayakpaul Jul 28, 2023
37c3a94
debugging
sayakpaul Jul 28, 2023
be066bc
debugging
sayakpaul Jul 28, 2023
b67a637
debugging
sayakpaul Jul 28, 2023
157b2cc
debugging
sayakpaul Jul 28, 2023
e500ccf
debugging
sayakpaul Jul 28, 2023
e4d9649
debugging
sayakpaul Jul 28, 2023
02d445e
debugging
sayakpaul Jul 28, 2023
1c66798
denugging
sayakpaul Jul 28, 2023
d372a7d
debugging
sayakpaul Jul 28, 2023
9ab6e7a
debugging
sayakpaul Jul 28, 2023
2e7c60d
debugging
sayakpaul Jul 28, 2023
fcd94eb
debugging
sayakpaul Jul 28, 2023
9f65a8b
debugging
sayakpaul Jul 28, 2023
837fce5
debugging
sayakpaul Jul 28, 2023
4a3ac8e
debugging
sayakpaul Jul 28, 2023
7852599
debugging
sayakpaul Jul 28, 2023
7ac4526
debugging
sayakpaul Jul 28, 2023
163fe5e
debugging
sayakpaul Jul 28, 2023
6f2507a
debugging
sayakpaul Jul 28, 2023
ddf56e6
debugging
sayakpaul Jul 28, 2023
ddf11a1
debugging
sayakpaul Jul 28, 2023
f5d24a2
debugging
sayakpaul Jul 28, 2023
8032778
debugging
sayakpaul Jul 28, 2023
3e376b2
debugging
sayakpaul Jul 28, 2023
48c44bc
debugging
sayakpaul Jul 28, 2023
db2eea0
debugging
sayakpaul Jul 28, 2023
4125c4d
up
sayakpaul Jul 28, 2023
94bbe5e
up
sayakpaul Jul 28, 2023
260c903
up
sayakpaul Jul 28, 2023
31a4104
up
sayakpaul Jul 28, 2023
fbf1c73
up
sayakpaul Jul 28, 2023
d676e61
up
sayakpaul Jul 28, 2023
6a6654b
unneeded comments.
sayakpaul Jul 28, 2023
db0e8d0
unneeded comments.
sayakpaul Jul 28, 2023
43a3955
kwargs for the other attention processors.
sayakpaul Jul 28, 2023
bb55eaa
kwargs for the other attention processors.
sayakpaul Jul 28, 2023
2bbd5c3
debugging
sayakpaul Jul 28, 2023
68edb87
debugging
sayakpaul Jul 28, 2023
451fee1
debugging
sayakpaul Jul 28, 2023
e706f14
debugging
sayakpaul Jul 28, 2023
edf9ea9
improve
patrickvonplaten Jul 28, 2023
07ae365
Merge branch 'feat/sdxl-lora-1' of https://github.com/huggingface/dif…
patrickvonplaten Jul 28, 2023
2d77481
debugging
sayakpaul Jul 28, 2023
a50c7ee
debugging
sayakpaul Jul 28, 2023
281a042
more print
patrickvonplaten Jul 28, 2023
8904959
Merge branch 'feat/sdxl-lora-1' of https://github.com/huggingface/dif…
patrickvonplaten Jul 28, 2023
00f391c
Fix alphas
patrickvonplaten Jul 28, 2023
c53277a
debugging
sayakpaul Jul 28, 2023
adafaf2
debugging
sayakpaul Jul 28, 2023
fd4dd0e
debugging
sayakpaul Jul 28, 2023
0519e4a
debugging
sayakpaul Jul 28, 2023
6c08115
debugging
sayakpaul Jul 28, 2023
8ae3798
debugging
sayakpaul Jul 28, 2023
bf7fe22
Merge branch 'main' into feat/sdxl-lora-1
patrickvonplaten Jul 28, 2023
baf386d
clean up
sayakpaul Jul 28, 2023
060f9f7
clean up.
sayakpaul Jul 28, 2023
98cada2
debugging
sayakpaul Jul 28, 2023
cc4a019
fix: text
sayakpaul Jul 28, 2023
distingisuih better
sayakpaul committed Jul 26, 2023
commit 4ea13cf7a5244e6f379f17b6700b2df7c0b1565a
20 changes: 14 additions & 6 deletions src/diffusers/loaders.py
@@ -1371,12 +1371,20 @@ def _convert_kohya_lora_to_diffusers(cls, state_dict):
 
             if lora_name.startswith("lora_unet_"):
                 diffusers_name = key.replace("lora_unet_", "").replace("_", ".")
-                diffusers_name = diffusers_name.replace("down.blocks", "down_blocks")
-                diffusers_name = diffusers_name.replace("input.blocks", "down_blocks")
-                diffusers_name = diffusers_name.replace("mid.block", "mid_block")
-                diffusers_name = diffusers_name.replace("middle.block", "mid_block")
-                diffusers_name = diffusers_name.replace("up.blocks", "up_blocks")
-                diffusers_name = diffusers_name.replace("output.blocks", "up_blocks")
+                if "input.blocks" in diffusers_name:
+                    diffusers_name = diffusers_name.replace("input.blocks", "down_blocks")
+                else:
+                    diffusers_name = diffusers_name.replace("down.blocks", "down_blocks")
+
+                if "middle.block" in diffusers_name:
+                    diffusers_name = diffusers_name.replace("middle.block", "mid_block")
+                else:
+                    diffusers_name = diffusers_name.replace("mid.block", "mid_block")
+                if "output.blocks" in diffusers_name:
+                    diffusers_name = diffusers_name.replace("output.blocks", "up_blocks")
+                else:
+                    diffusers_name = diffusers_name.replace("up.blocks", "up_blocks")
+
                 diffusers_name = diffusers_name.replace("transformer.blocks", "transformer_blocks")
                 diffusers_name = diffusers_name.replace("to.q.lora", "to_q_lora")
                 diffusers_name = diffusers_name.replace("to.k.lora", "to_k_lora")
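The change above makes the two naming schemes explicit: Kohya checkpoints exported from SGM-based trainers name UNet blocks `input_blocks`/`middle_block`/`output_blocks`, while diffusers-style exports already use `down_blocks`/`mid_block`/`up_blocks`, and branching keeps the two schemes from interfering with each other. Below is a minimal standalone sketch of that renaming logic, not the PR's actual code path; the helper name `rename_unet_lora_key` and the sample keys are illustrative.

```python
# Minimal sketch of the renaming step in _convert_kohya_lora_to_diffusers.
# The helper name and the example keys are hypothetical.
def rename_unet_lora_key(key: str) -> str:
    name = key.replace("lora_unet_", "").replace("_", ".")
    # SGM-style checkpoints use input/middle/output blocks; diffusers-style
    # checkpoints already use down/mid/up blocks. Branching keeps the two
    # naming schemes mutually exclusive instead of chaining replacements.
    if "input.blocks" in name:
        name = name.replace("input.blocks", "down_blocks")
    else:
        name = name.replace("down.blocks", "down_blocks")
    if "middle.block" in name:
        name = name.replace("middle.block", "mid_block")
    else:
        name = name.replace("mid.block", "mid_block")
    if "output.blocks" in name:
        name = name.replace("output.blocks", "up_blocks")
    else:
        name = name.replace("up.blocks", "up_blocks")
    return name

# Both naming schemes land on the diffusers key layout:
print(rename_unet_lora_key("lora_unet_input_blocks_1_0_emb_layers_1"))       # down_blocks.1.0.emb.layers.1
print(rename_unet_lora_key("lora_unet_down_blocks_1_attentions_0_proj_in"))  # down_blocks.1.attentions.0.proj.in
```

Once the keys convert cleanly, such a checkpoint should load through the regular entry point, e.g. `pipe.load_lora_weights("path/to/kohya_sdxl_lora.safetensors")` on a `StableDiffusionXLPipeline` (the file path is a placeholder).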