
Conversation

@pacman100
Contributor

What does this PR do?

  1. support multiple rank/alpha values
  2. support multiple active adapters
  3. support disabling and enabling adapters
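
A minimal sketch of the semantics behind these three features, using illustrative names only (`Adapter` and `AdapterRegistry` are hypothetical, not the diffusers or PEFT API): each adapter carries its own rank/alpha pair, several adapters can be active at once, and all adapters can be disabled and re-enabled as a group.

```python
# Illustrative sketch only: Adapter and AdapterRegistry are hypothetical
# names, not part of diffusers or PEFT.
from dataclasses import dataclass


@dataclass
class Adapter:
    rank: int
    alpha: float

    @property
    def scale(self) -> float:
        # Standard LoRA scaling factor: alpha / rank.
        return self.alpha / self.rank


class AdapterRegistry:
    def __init__(self) -> None:
        self._adapters: dict[str, Adapter] = {}  # feature 1: per-adapter rank/alpha
        self._active: list[str] = []             # feature 2: multiple active adapters
        self._enabled: bool = True               # feature 3: global enable/disable

    def add(self, name: str, rank: int, alpha: float) -> None:
        self._adapters[name] = Adapter(rank, alpha)

    def set_active(self, names: list[str]) -> None:
        missing = [n for n in names if n not in self._adapters]
        if missing:
            raise KeyError(f"unknown adapters: {missing}")
        self._active = list(names)

    def disable(self) -> None:
        self._enabled = False

    def enable(self) -> None:
        self._enabled = True

    def effective_scales(self) -> dict[str, float]:
        # When disabled, no adapter contributes to the forward pass.
        if not self._enabled:
            return {}
        return {n: self._adapters[n].scale for n in self._active}


reg = AdapterRegistry()
reg.add("style", rank=4, alpha=8.0)    # scale 8/4 = 2.0
reg.add("subject", rank=8, alpha=8.0)  # scale 8/8 = 1.0
reg.set_active(["style", "subject"])
print(reg.effective_scales())  # {'style': 2.0, 'subject': 1.0}
reg.disable()
print(reg.effective_scales())  # {}
```

In diffusers itself these semantics surface through methods on the pipeline and its models; the sketch above only mirrors the bookkeeping, not the actual LoRA layer math.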

Member

@sayakpaul sayakpaul left a comment


Agree with Patrick's and Benjamin's comments. Mine are mostly clarification questions and nits.

lora_scale=self.lora_scale,
low_cpu_mem_usage=low_cpu_mem_usage,
_pipeline=self,
adapter_name=adapter_name,

Let's keep the private variable at the end.

Member

@sayakpaul sayakpaul left a comment


Looks very clean to me! Thanks so much for iterating.

I see only the `text_encoder` attribute being used in this PR. Are we not tackling `text_encoder_2` here?

I didn't see any tests, though. Will they be added in a follow-up?

@younesbelkada
Contributor

For the tests, I am happy to add them here: #5151 to complete the full testing suite

See my comment: #5151 (comment)

@HuggingFaceDocBuilderDev

HuggingFaceDocBuilderDev commented Sep 27, 2023

The documentation is not available anymore as the PR was closed or merged.

@patrickvonplaten
Contributor

For the tests, I am happy to add them here: #5151 to complete the full testing suite

See my comment: #5151 (comment)

Ok for me to add tests in a later PR

@patrickvonplaten
Contributor

Very nice job @pacman100! I think the design is very clean, and I agree that the higher-level enable/disable functions are very useful for users (to quickly switch all LoRAs on or off).

Contributor

@patrickvonplaten patrickvonplaten left a comment


Ok for me to merge once CI is green (ok if LoRA peft tests are still failing)

@patrickvonplaten patrickvonplaten merged commit 02247d9 into main Sep 27, 2023
@kashif kashif deleted the smangrul/peft-integration branch September 29, 2023 11:38
yoonseokjin pushed a commit to yoonseokjin/diffusers that referenced this pull request Dec 25, 2023
…sable/enable adapters and support for multiple adapters (huggingface#5147)

* more fixes

* up

* up

* style

* add in setup

* oops

* more changes

* v1 rzfactor CI

* Apply suggestions from code review

Co-authored-by: Patrick von Platen <[email protected]>

* few todos

* protect torch import

* style

* fix fuse text encoder

* Update src/diffusers/loaders.py

Co-authored-by: Sayak Paul <[email protected]>

* replace with `recurse_replace_peft_layers`

* keep old modules for BC

* adjustments on `adjust_lora_scale_text_encoder`

* nit

* move tests

* add conversion utils

* remove unneeded methods

* use class method instead

* oops

* use `base_version`

* fix examples

* fix CI

* fix weird error with python 3.8

* fix

* better fix

* style

* Apply suggestions from code review

Co-authored-by: Sayak Paul <[email protected]>
Co-authored-by: Patrick von Platen <[email protected]>

* Apply suggestions from code review

Co-authored-by: Patrick von Platen <[email protected]>

* add comment

* Apply suggestions from code review

Co-authored-by: Sayak Paul <[email protected]>

* conv2d support for recurse remove

* added docstrings

* more docstring

* add deprecate

* revert

* try to fix merge conflicts

* peft integration features for text encoder

1. support multiple rank/alpha values
2. support multiple active adapters
3. support disabling and enabling adapters

* fix bug

* fix code quality

* Apply suggestions from code review

Co-authored-by: Younes Belkada <[email protected]>

* fix bugs

* Apply suggestions from code review

Co-authored-by: Younes Belkada <[email protected]>

* address comments

Co-Authored-By: Benjamin Bossan <[email protected]>
Co-Authored-By: Patrick von Platen <[email protected]>

* fix code quality

* address comments

* address comments

* Apply suggestions from code review

* find and replace

---------

Co-authored-by: younesbelkada <[email protected]>
Co-authored-by: Younes Belkada <[email protected]>
Co-authored-by: Patrick von Platen <[email protected]>
Co-authored-by: Sayak Paul <[email protected]>
Co-authored-by: Benjamin Bossan <[email protected]>
AmericanPresidentJimmyCarter pushed a commit to AmericanPresidentJimmyCarter/diffusers that referenced this pull request Apr 26, 2024
…sable/enable adapters and support for multiple adapters (huggingface#5147)
