Lazy Import for Diffusers #4829


Merged: 70 commits merged into main on Sep 11, 2023

Conversation

DN6
Collaborator

@DN6 DN6 commented Aug 29, 2023

What does this PR do?

Adds lazy import functionality to Diffusers, similar to what exists in Transformers.

Benchmark:

Initially tested the import speed-up by running time python -c "import diffusers" with all backends (torch, transformers, flax, onnxruntime) installed.

With Lazy Import

real    0m0.417s
user    0m0.714s
sys     0m0.499s

Without Lazy Import

real    0m5.391s
user    0m5.299s
sys     0m1.273s

This PR:

  1. Adds lazy import to the following modules:
     - diffusers
     - models
     - pipelines
     - schedulers
  2. Moves objects defined in the __init__.py files of a module to their own dedicated files, e.g. StableDiffusionPipelineOutput now lives in a pipeline_output.py file within the stable_diffusion module. This follows the same format as transformers and keeps __init__.py reserved for import-related code.
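The lazy-import mechanism described above can be sketched as a module subclass that defers submodule imports until an attribute is first accessed. This is a minimal, hypothetical illustration (the class name, dict shape, and the math/json stand-ins are mine); diffusers' actual _LazyModule is more elaborate and also handles dummy objects, __dir__, and pickling.

```python
import importlib
import types

class LazyModule(types.ModuleType):
    """Minimal sketch of a lazily-loaded module: submodules are imported
    only when one of their attributes is first accessed."""

    def __init__(self, name, import_structure):
        super().__init__(name)
        # Map each public attribute to the submodule that provides it.
        self._attr_to_module = {
            attr: mod for mod, attrs in import_structure.items() for attr in attrs
        }
        # Advertise every name up front so dir()/tab completion still work.
        self.__all__ = list(self._attr_to_module)

    def __getattr__(self, name):
        module_name = self._attr_to_module.get(name)
        if module_name is None:
            raise AttributeError(f"module {self.__name__!r} has no attribute {name!r}")
        value = getattr(importlib.import_module(module_name), name)
        setattr(self, name, value)  # cache: later lookups bypass __getattr__
        return value

# Stand-in usage: "math"/"json" play the role of heavy backends like torch.
lazy = LazyModule("demo", {"math": ["sqrt"], "json": ["dumps"]})
print(lazy.sqrt(9.0))  # math is imported only at this point
```

The caching `setattr` means the lazy lookup cost is paid once per name; subsequent accesses are ordinary attribute hits.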

TODO:

  1. The utils/check_dummies behaviour and the corresponding test are broken because of the new import structure. This is the last thing that needs to be addressed before merging.

Fixes #4260

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@DN6 DN6 requested a review from patrickvonplaten August 29, 2023 13:01
@patrickvonplaten
Contributor

I'd actually do one big PR here and we can try to merge it quickly to not run into merge conflicts :-)

We should measure the speed-up of importing things once this is merged and also try to clean up many of these imports inside functions in the loaders.py file, e.g.:

from .models.attention_processor import (
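The import-inside-a-function pattern referenced here defers a dependency's import cost to first call rather than to `import diffusers`. A hedged sketch (the function name is illustrative, and `json` stands in for a heavy dependency):

```python
# Sketch of a function-local import: the dependency is imported on first
# call, not when the enclosing module is imported. The function name is
# illustrative; `json` stands in for a heavy dependency like torch.

def parse_attention_config(text: str) -> dict:
    import json  # deferred: the cost is paid only if this function runs
    return json.loads(text)

print(parse_attention_config('{"heads": 8}')["heads"])  # → 8
```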

@DN6
Collaborator Author

DN6 commented Sep 6, 2023

@patrickvonplaten
I added TYPE_CHECKING to the diffusers init, moved the accelerate import out of the utils init, and updated the relevant files.

The import speed should be improved. Results on my machine with all backends + accelerate installed:

time python -c "import diffusers"

real    0m0.417s
user    0m0.714s
sys     0m0.499s

Could you verify that you're seeing similar numbers after the changes?

@patrickvonplaten
Contributor

Ok, I think I found one last package that slows down the import: xformers (https://github.com/facebookresearch/xformers). I'd say most of our users have xformers installed - can you also make sure xformers is not imported by default?

@DN6
Collaborator Author

DN6 commented Sep 6, 2023

@patrickvonplaten changed it so that we use _torch_version = importlib_metadata.version("torch") instead of import torch; torch.__version__ when checking for xformers availability.
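The change described can be illustrated roughly as follows: reading the installed distribution's metadata returns the version string without executing the package's import machinery, which is why it is much cheaper than `import torch`. The helper name below is mine; only the `importlib.metadata.version` call mirrors the actual change.

```python
import importlib.metadata
import importlib.util
from typing import Optional

def package_version(package: str) -> Optional[str]:
    """Return the installed version of `package` without importing it,
    or None if it isn't available. Helper name is illustrative."""
    if importlib.util.find_spec(package) is None:
        return None
    try:
        # Reads dist metadata only; never executes the package's __init__.
        return importlib.metadata.version(package)
    except importlib.metadata.PackageNotFoundError:
        # Importable but no distribution metadata (e.g. a stdlib module).
        return None

print(package_version("no_such_package_xyz"))  # → None
```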

@patrickvonplaten
Contributor

@DN6 could you take a look at the final failing tests or do you need help here?

@patrickvonplaten
Contributor

I think we also need to integrate the changes from the just-merged Wuerstchen PR.

@DN6
Collaborator Author

DN6 commented Sep 7, 2023

@patrickvonplaten
Fixed conflicts and added Wuerstchen.

Seeing some strange behaviour in the Fast tests for PRs / Fast PyTorch Models & Schedulers CPU tests.

When running

python -m pytest -n 2 --max-worker-restart=0 --dist=loadfile     -s -v -k "not Flax and not Onnx"  tests/models tests/schedulers

All tests pass

The failures occur when tests/others is also included. I isolated it to tests/others/test_check_copies.py.

If you run this test alongside the model/scheduler tests, you get a few failures in the scheduler tests:

FAILED tests/schedulers/test_schedulers.py::SchedulerBaseTests::test_save_load_compatible_schedulers - AssertionError: assert 'The config a...ation file.\n' == 'The config a...ation file.\n'
FAILED tests/schedulers/test_schedulers.py::SchedulerBaseTests::test_save_load_from_different_config - AssertionError: assert 'The config a...ation file.\n' == ''
FAILED tests/schedulers/test_schedulers.py::SchedulerBaseTests::test_save_load_from_different_config_comp_schedulers - assert 'The config a...ult values.\n' == "{'f'} was no...ult values.\n"

Any thoughts on what's happening? I can't seem to figure it out.

Comment on lines -32 to -40
# This is to make sure the diffusers module imported is the one in the repo.
spec = importlib.util.spec_from_file_location(
    "diffusers",
    os.path.join(DIFFUSERS_PATH, "__init__.py"),
    submodule_search_locations=[DIFFUSERS_PATH],
)
diffusers_module = spec.loader.load_module()


Collaborator Author

@patrickvonplaten This section is causing the failing fast tests for models and schedulers. diffusers_module doesn't appear to be used anywhere, so I removed the snippet.

LMK if there's something I've missed here.

Contributor

That works!

@patrickvonplaten
Contributor

Alright, great job - let's get this one in to avoid any more merge conflicts.
Over the next few days, let's watch for test failures that might be related to this. cc @DN6

@patrickvonplaten patrickvonplaten merged commit b6e0b01 into main Sep 11, 2023
for name, value in _dummy_objects.items():
    setattr(sys.modules[__name__], name, value)
for name, value in _additional_imports.items():
    setattr(sys.modules[__name__], name, value)
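For context, here is a minimal, hypothetical sketch of what the dummy objects attached by that loop might look like: when a backend is missing, the public names still exist but raise a helpful ImportError on any use. The class names and error message are illustrative, not diffusers' actual dummies.

```python
import types

# Hypothetical sketch of the dummy-object fallback for a missing backend.

class DummyObject(type):
    """Metaclass: any class-level attribute access raises ImportError."""
    def __getattr__(cls, key):
        raise ImportError(f"{cls.__name__} requires the torch backend.")

class UNet2DModel(metaclass=DummyObject):
    _backends = ["torch"]
    def __init__(self, *args, **kwargs):
        raise ImportError("UNet2DModel requires the torch backend.")

# The loop from the snippet above, attaching dummies to a module namespace:
pkg = types.ModuleType("demo_pkg")
_dummy_objects = {"UNet2DModel": UNet2DModel}
for name, value in _dummy_objects.items():
    setattr(pkg, name, value)

try:
    pkg.UNet2DModel()
except ImportError as exc:
    print("caught:", exc)
```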
Contributor

We also need the "if TYPE_CHECKING" branch here.
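The if TYPE_CHECKING branch being requested can be sketched like this: type checkers follow the real imports, while at runtime names resolve lazily via PEP 562's module-level __getattr__. The math/sqrt stand-ins and the _lazy dict are illustrative, not diffusers' actual code; the example builds a throwaway module so the pattern is runnable end to end.

```python
import types

# Sketch of a package __init__.py tail: the TYPE_CHECKING branch gives
# static analyzers real imports, while the runtime branch installs lazy
# attribute resolution (PEP 562). Names here are illustrative stand-ins.

SOURCE = """
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from math import sqrt           # checkers/IDEs follow this real import
else:
    import importlib
    _lazy = {"sqrt": "math"}        # public name -> providing module

    def __getattr__(name):          # module-level lazy lookup
        if name in _lazy:
            return getattr(importlib.import_module(_lazy[name]), name)
        raise AttributeError(name)
"""

# Simulate importing that __init__.py as a module named "demo_pkg".
demo = types.ModuleType("demo_pkg")
exec(SOURCE, demo.__dict__)
print(demo.sqrt(16.0))  # resolved lazily through __getattr__
```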

@Abhinay1997
Contributor

Abhinay1997 commented Sep 12, 2023

@DN6, it looks like StableDiffusionGLIGENPipeline is added twice (see Line 238-240) in src/diffusers/__init__.py. I don't think it'll affect anything, but just letting you know.

Abhinay1997 added a commit to Abhinay1997/diffusers that referenced this pull request Sep 12, 2023
@DN6 DN6 mentioned this pull request Sep 12, 2023
6 tasks
yoonseokjin pushed a commit to yoonseokjin/diffusers that referenced this pull request Dec 25, 2023
* initial commit

* move modules to import struct

* add dummy objects and _LazyModule

* add lazy import to schedulers

* clean up unused imports

* lazy import on models module

* lazy import for schedulers module

* add lazy import to pipelines module

* lazy import altdiffusion

* lazy import audio diffusion

* lazy import audioldm

* lazy import consistency model

* lazy import controlnet

* lazy import dance diffusion ddim ddpm

* lazy import deepfloyd

* lazy import kandinksy

* lazy imports

* lazy import semantic diffusion

* lazy imports

* lazy import stable diffusion

* move sd output to its own module

* clean up

* lazy import t2iadapter

* lazy import unclip

* lazy import versatile and vq diffsuion

* lazy import vq diffusion

* helper to fetch objects from modules

* lazy import sdxl

* lazy import txt2vid

* lazy import stochastic karras

* fix model imports

* fix bug

* lazy import

* clean up

* clean up

* fixes for tests

* fixes for tests

* clean up

* remove import of torch_utils from utils module

* clean up

* clean up

* fix mistake import statement

* dedicated modules for exporting and loading

* remove testing utils from utils module

* fixes from  merge conflicts

* Update src/diffusers/pipelines/kandinsky2_2/__init__.py

* fix docs

* fix alt diffusion copied from

* fix check dummies

* fix more docs

* remove accelerate import from utils module

* add type checking

* make style

* fix check dummies

* remove torch import from xformers check

* clean up error message

* fixes after upstream merges

* dummy objects fix

* fix tests

* remove unused module import

---------

Co-authored-by: Patrick von Platen <[email protected]>
AmericanPresidentJimmyCarter pushed a commit to AmericanPresidentJimmyCarter/diffusers that referenced this pull request Apr 26, 2024