
Conversation

@patil-suraj
Contributor

@patil-suraj patil-suraj commented Sep 9, 2022

This PR removes the hardcoded value of 32 for norm_num_groups and makes sure that the argument is passed to all up, down, res, and attention blocks.

Fixes #410
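For context, the constraint behind the original hardcoding is that group normalization requires each normalized channel count to be divisible by the number of groups, so a fixed `norm_num_groups=32` forces all block channel sizes to be multiples of 32. A minimal sketch of that constraint (the helper `validate_norm_num_groups` is hypothetical, not part of the diffusers API):

```python
# Hypothetical helper illustrating the GroupNorm divisibility constraint;
# not part of diffusers.
def validate_norm_num_groups(block_out_channels, norm_num_groups=32):
    """Raise if any block's channel count is not divisible by norm_num_groups."""
    for ch in block_out_channels:
        if ch % norm_num_groups != 0:
            raise ValueError(
                f"channel count {ch} is not divisible by "
                f"norm_num_groups={norm_num_groups}"
            )
    return True

# With the hardcoded default of 32, channel sizes must be multiples of 32
# (the limitation reported in #410):
assert validate_norm_num_groups((64, 128, 256))
# Passing a smaller value, e.g. 8, admits finer-grained channel sizes:
assert validate_norm_num_groups((24, 48, 96), norm_num_groups=8)
```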

@HuggingFaceDocBuilderDev

HuggingFaceDocBuilderDev commented Sep 9, 2022

The documentation is not available anymore as the PR was closed or merged.

@ezhang7423

Thank you everyone!

Contributor

@patrickvonplaten patrickvonplaten left a comment


Very cool @patil-suraj !


self.assertTrue(torch.allclose(output_slice, expected_output_slice, rtol=1e-2))

def test_forward_with_norm_groups(self):
Contributor


Why?

Contributor Author


It seems to adjust the norm groups inside the blocks, cf. https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/unet_blocks.py#L780
So I'm not sure it can work with any value, because of the division by 4.
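To illustrate the concern: the linked block code appears to derive a default group count from the channel count rather than always using the passed value. A hedged sketch of that heuristic (the exact `min(in_channels // 4, 32)` formula is an assumption based on the linked line, and `default_resnet_groups` is a hypothetical name):

```python
# Sketch of the fallback heuristic the comment refers to (an assumption
# based on the linked unet_blocks.py line): when resnet_groups is not
# provided, the block derives it from the channel count.
def default_resnet_groups(in_channels, resnet_groups=None):
    if resnet_groups is not None:
        return resnet_groups
    return min(in_channels // 4, 32)

# For small channel counts the derived value is in_channels // 4:
assert default_resnet_groups(64) == 16
# For large channel counts it is capped at 32:
assert default_resnet_groups(256) == 32
# An explicitly passed value overrides the heuristic:
assert default_resnet_groups(256, resnet_groups=8) == 8
```

This is why simply forwarding an arbitrary `norm_num_groups` could conflict with the derived default inside the block.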

Contributor

@patrickvonplaten patrickvonplaten left a comment


Sorry @patil-suraj - let's really try not to change existing tests (except when they are wrong). It would be really nice to just add a new test here.

@patil-suraj
Contributor Author

patil-suraj commented Sep 15, 2022

Sorry @patil-suraj - let's really try not to change existing tests (except when they are wrong). It would be really nice to just add a new test here.

Agreed. I have already added a new common test, test_forward_with_norm_groups; the change was leftover from a previous commit.

@patil-suraj patil-suraj merged commit d144c46 into main Sep 15, 2022
@patil-suraj patil-suraj deleted the pass-norm_num_groups branch September 15, 2022 14:35
yoonseokjin pushed a commit to yoonseokjin/diffusers that referenced this pull request Dec 25, 2023
…locks (huggingface#442)

* pass norm_num_groups to unet blocs and attention

* fix UNet2DConditionModel

* add norm_num_groups arg in vae

* add tests

* remove comment

* Apply suggestions from code review


Successfully merging this pull request may close these issues.

Unet model only supports internal block channel sizes in multiples of 32
