 processor (`dict` of `AttentionProcessor` or `AttentionProcessor`):
     The instantiated processor class or a dictionary of processor classes that will be set as the processor
     of **all** `Attention` layers.
-    In case `processor` is a dict, the key needs to define the path to the corresponding cross attention processor. This is strongly recommended when setting trainablae attention processors.:
+    In case `processor` is a dict, the key needs to define the path to the corresponding cross attention processor. This is strongly recommended when setting trainable attention processors.:
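As a usage illustration (not part of the PR), here is a minimal sketch of both forms this docstring describes. It assumes a recent diffusers version; the checkpoint name is illustrative only, and import paths for processor classes have moved between diffusers releases:

```python
from diffusers import UNet2DConditionModel
from diffusers.models.attention_processor import AttnProcessor

# Illustrative checkpoint; any UNet2DConditionModel works the same way.
unet = UNet2DConditionModel.from_pretrained(
    "runwayml/stable-diffusion-v1-5", subfolder="unet"
)

# Form 1: a single instantiated processor is set on **all** Attention layers.
unet.set_attn_processor(AttnProcessor())

# Form 2: a dict whose keys are the module paths of the attention processors,
# e.g. "down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor".
# The `attn_processors` property yields exactly these paths.
processors = {path: AttnProcessor() for path in unet.attn_processors.keys()}
unet.set_attn_processor(processors)
```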
src/diffusers/models/transformer_2d.py (3 additions, 3 deletions)
@@ -105,7 +105,7 @@ def __init__(
         self.attention_head_dim = attention_head_dim
         inner_dim = num_attention_heads * attention_head_dim

-        # 1. Transformer2DModel can process both standard continous images of shape `(batch_size, num_channels, width, height)` as well as quantized image embeddings of shape `(batch_size, num_image_vectors)`
+        # 1. Transformer2DModel can process both standard continuous images of shape `(batch_size, num_channels, width, height)` as well as quantized image embeddings of shape `(batch_size, num_image_vectors)`
         # Define whether input is continuous or discrete depending on configuration
         self.is_input_continuous = (in_channels is not None) and (patch_size is None)
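For context, a small sketch (not from the PR; tensor sizes are made up) of the two input kinds the corrected comment describes:

```python
import torch

# Continuous input: image latents of shape (batch_size, num_channels, width, height).
continuous_sample = torch.randn(2, 320, 64, 64)

# Discrete input: quantized image embeddings of shape (batch_size, num_image_vectors),
# i.e. integer indices into a learned codebook.
discrete_sample = torch.randint(0, 8192, (2, 1024))

# The configuration flag from the diff: continuous mode requires in_channels
# to be set and no patch_size.
in_channels, patch_size = 320, None
is_input_continuous = (in_channels is not None) and (patch_size is None)
assert is_input_continuous
```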
 processor (`dict` of `AttentionProcessor` or `AttentionProcessor`):
     The instantiated processor class or a dictionary of processor classes that will be set as the processor
     of **all** `Attention` layers.
-    In case `processor` is a dict, the key needs to define the path to the corresponding cross attention processor. This is strongly recommended when setting trainablae attention processors.:
+    In case `processor` is a dict, the key needs to define the path to the corresponding cross attention processor. This is strongly recommended when setting trainable attention processors.:
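The dict form matters most for trainable processors (this rationale is an inference, not stated in the diff): when processors carry learnable weights, keying them by module path keeps each layer's parameters tied to the correct `Attention` module when they are saved and later restored.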