
Commit 220ceb2

JackWeiw authored and jinminxi104 committed
fix: expand shape of attn_mask (#10)
1 parent e4c3aca commit 220ceb2

File tree

1 file changed: +1 −1 lines changed


lmdeploy/pytorch/engine/devices/ascend.py

Lines changed: 1 addition & 1 deletion
@@ -17,7 +17,7 @@ def update_step_context(cls, step_context):
         single_attention_mask = torch.logical_not(
             torch.tril(
                 torch.ones(step_context.q_seq_length[i],
-                           (step_context.kv_seq_length[i] + 31) & (~31),
+                           step_context.block_offsets.shape[1] * block_size,
                            dtype=torch.bool).cuda(),
                 diagonal=step_context.kv_seq_length[i] -
                 step_context.q_seq_length[i],
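The change above widens the mask's second dimension from the KV sequence length rounded up to a multiple of 32 to the full padded width of the paged KV cache (number of blocks times block size), so the mask shape matches the cache layout. A minimal sketch of the resulting mask construction, with the surrounding engine code assumed (the helper name and standalone arguments here are hypothetical, and `.cuda()` is dropped so it runs on CPU):

```python
import torch

def build_single_attention_mask(q_len, kv_len, num_blocks, block_size):
    # After the fix, the mask width equals the padded KV cache width
    # (num_blocks * block_size) instead of kv_len rounded up to 32.
    kv_width = num_blocks * block_size
    # tril with diagonal = kv_len - q_len keeps position (i, j)
    # visible when j <= i + (kv_len - q_len); logical_not flips it so
    # True marks positions that must be masked out.
    return torch.logical_not(
        torch.tril(
            torch.ones(q_len, kv_width, dtype=torch.bool),
            diagonal=kv_len - q_len,
        ))

# Example: 4 query tokens attending over 6 cached tokens,
# with a block table of 2 blocks of 16 slots each.
m = build_single_attention_mask(4, 6, 2, 16)
# m.shape is (4, 32); padded slots beyond the causal frontier are True (masked).
```

The padded tail (slots past `kv_len`) falls above the diagonal, so it is masked the same way as future tokens.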

0 commit comments
