Complete set_attn_processor for prior and vae #3796

Merged
merged 14 commits into main Jun 15, 2023

Conversation

patrickvonplaten
Contributor

@patrickvonplaten patrickvonplaten commented Jun 15, 2023

This PR makes sure that:

  • UNet2DConditionModel
  • PriorTransformer
  • AutoencoderKL

all support the same attention-processor features. It also cleans up the tests a bit and adds a whole test suite for the prior transformer.
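The shared feature is the recursive `set_attn_processor` / `attn_processors` API. Below is a minimal, self-contained sketch of that pattern in plain Python, not the actual diffusers implementation; the class names (`Attention`, `Block`, `Model`) and the dotted-path key format are illustrative assumptions:

```python
class Attention:
    """Stand-in for an attention block that holds a swappable processor."""

    def __init__(self):
        self.processor = None

    def set_processor(self, processor):
        self.processor = processor


class Block:
    """A sub-module containing one attention layer."""

    def __init__(self):
        self.attn = Attention()


class Model:
    """Illustrative model with nested attention layers."""

    def __init__(self):
        self.blocks = [Block(), Block()]

    @property
    def attn_processors(self):
        # Collect every processor, keyed by a dotted module path.
        procs = {}
        for i, block in enumerate(self.blocks):
            procs[f"blocks.{i}.attn.processor"] = block.attn.processor
        return procs

    def set_attn_processor(self, processor):
        # Accept either one processor for all layers, or a dict keyed by path.
        count = len(self.attn_processors)
        if isinstance(processor, dict) and len(processor) != count:
            raise ValueError(
                f"Expected {count} processors, got {len(processor)}."
            )
        for i, block in enumerate(self.blocks):
            if isinstance(processor, dict):
                block.attn.set_processor(processor[f"blocks.{i}.attn.processor"])
            else:
                block.attn.set_processor(processor)
```

The point of giving all three model classes the same API is that downstream code (e.g. attention slicing or custom-processor injection) can treat them uniformly.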

@HuggingFaceDocBuilderDev

HuggingFaceDocBuilderDev commented Jun 15, 2023

The documentation is not available anymore as the PR was closed or merged.

@patrickvonplaten patrickvonplaten changed the title Add prior tests Complete set_attn_processor for prior and vae Jun 15, 2023
@patrickvonplaten patrickvonplaten mentioned this pull request Jun 15, 2023
Contributor

@kashif kashif left a comment

i love the tests!


import torch
import torch.nn as nn

from ..configuration_utils import ConfigMixin, register_to_config
from ..utils import BaseOutput, apply_forward_hook
from .attention_processor import AttentionProcessor, AttnProcessor
Member

Should we maybe also implement this automatic discoverability of AttnProcessor2_0?

AttnProcessor2_0() if hasattr(F, "scaled_dot_product_attention") and self.scale_qk else AttnProcessor()
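The line quoted above picks `AttnProcessor2_0` only when PyTorch's `torch.nn.functional` exposes `scaled_dot_product_attention` (PyTorch 2.0+). A hedged sketch of that feature-detection pattern follows, with stand-in processor classes and `SimpleNamespace` objects in place of the real `F` module so it runs without torch installed:

```python
import types


class AttnProcessor:
    """Fallback processor (illustrative stand-in, not the diffusers class)."""


class AttnProcessor2_0:
    """Processor using the fused scaled_dot_product_attention kernel (stand-in)."""


def default_processor(F, scale_qk=True):
    # Mirror the selection logic: prefer the 2.0 processor only when the
    # fused kernel is available AND scaled QK attention is in use.
    if hasattr(F, "scaled_dot_product_attention") and scale_qk:
        return AttnProcessor2_0()
    return AttnProcessor()


# Simulate an old torch.nn.functional without the fused kernel ...
old_F = types.SimpleNamespace()
# ... and a new one that has it.
new_F = types.SimpleNamespace(scaled_dot_product_attention=lambda *a, **k: None)
```

`hasattr`-based detection like this keeps the library working across PyTorch versions without a hard version check.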

Contributor Author

I don't fully follow here. Note that by default AttnProcessor2_0 is always chosen whenever someone uses the Attention class, which is the case for both the AutoencoderKL and the PriorTransformer, see:

@@ -182,8 +184,9 @@ def test_output_pretrained(self):
self.assertTrue(torch_all_close(output_slice, expected_output_slice, rtol=1e-3))


-class NCSNppModelTests(ModelTesterMixin, unittest.TestCase):
+class NCSNppModelTests(ModelTesterMixin, UNetTesterMixin, unittest.TestCase):
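The diff above folds shared checks into mixin classes combined with `unittest.TestCase`. A minimal sketch of that mixin pattern, with an illustrative `DummyUNet` and check methods that are assumptions rather than the actual diffusers test code:

```python
import inspect
import unittest


class ModelTesterMixin:
    """Shared checks every model test suite inherits."""

    def test_has_forward(self):
        self.assertTrue(callable(getattr(self.model_class, "forward", None)))


class UNetTesterMixin:
    """UNet-specific shared checks."""

    def test_forward_signature(self):
        params = inspect.signature(self.model_class.forward).parameters
        self.assertIn("sample", params)


class DummyUNet:
    """Hypothetical model under test."""

    def forward(self, sample):
        return sample


class DummyUNetTests(ModelTesterMixin, UNetTesterMixin, unittest.TestCase):
    # Each concrete suite only declares its model class; the mixins
    # contribute the test methods.
    model_class = DummyUNet
```

Adding `UNetTesterMixin` to an existing suite, as the diff does, picks up every shared UNet check without duplicating test code.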
Member

Curiosity:

What does NCSNpp stand for?

Contributor Author

@@ -153,8 +154,9 @@ def test_unet_1d_maestro(self):
assert (output_max - 0.0607).abs() < 4e-4


-class UNetRLModelTests(ModelTesterMixin, unittest.TestCase):
+class UNetRLModelTests(ModelTesterMixin, UNetTesterMixin, unittest.TestCase):
Member

Curiosity:

Why is it called UNetRLModelTests?

Contributor Author

Tests an RL model we've integrated.

Member

@sayakpaul sayakpaul left a comment

Exceptionally clean!

@patrickvonplaten patrickvonplaten merged commit ea8ae8c into main Jun 15, 2023
@patrickvonplaten patrickvonplaten deleted the add_prior_tests branch June 15, 2023 15:43
yoonseokjin pushed a commit to yoonseokjin/diffusers that referenced this pull request Dec 25, 2023
* relax tolerance slightly

* Add more tests

* upload readme

* upload readme

* Apply suggestions from code review

* Improve API Autoencoder KL

* finalize

* finalize tests

* finalize tests

* Apply suggestions from code review

Co-authored-by: Sayak Paul <[email protected]>

* up

---------

Co-authored-by: Sayak Paul <[email protected]>
AmericanPresidentJimmyCarter pushed a commit to AmericanPresidentJimmyCarter/diffusers that referenced this pull request Apr 26, 2024
4 participants