Commit b267d28

[Versatile] fix attention mask (huggingface#1763)
1 parent c7b4acf commit b267d28

File tree

1 file changed: +1 -1 lines


src/diffusers/models/attention.py

Lines changed: 1 addition & 1 deletion
@@ -959,6 +959,7 @@ def forward(
 
         encoded_states = []
         tokens_start = 0
+        # attention_mask is not used yet
         for i in range(2):
            # for each of the two transformers, pass the corresponding condition tokens
            condition_state = encoder_hidden_states[:, tokens_start : tokens_start + self.condition_lengths[i]]
@@ -967,7 +968,6 @@ def forward(
                input_states,
                encoder_hidden_states=condition_state,
                timestep=timestep,
-               attention_mask=attention_mask,
                return_dict=False,
            )[0]
            encoded_states.append(encoded_state - input_states)
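For context, the loop in this hunk splits the concatenated condition tokens between the model's two transformers using `self.condition_lengths`. A minimal standalone sketch of that slicing logic (illustrative only; the function name and the token counts below are assumptions for the example, not values taken from diffusers):

```python
import torch

def split_condition_tokens(encoder_hidden_states, condition_lengths):
    """Slice the concatenated condition tokens into one chunk per transformer,
    mirroring the tokens_start bookkeeping in the diff above."""
    tokens_start = 0
    slices = []
    for length in condition_lengths:
        # each transformer sees only its own span of condition tokens
        slices.append(encoder_hidden_states[:, tokens_start : tokens_start + length])
        tokens_start += length
    return slices

# Example: batch of 2, with 77 + 257 condition tokens of dim 64 (sizes assumed)
states = torch.randn(2, 77 + 257, 64)
text_tokens, image_tokens = split_condition_tokens(states, [77, 257])
```

Each returned slice is then passed as `encoder_hidden_states` to the corresponding transformer, as in the loop body above.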
