update flash attn select (#54630) (#54716)
FeixLiu committed Jun 19, 2023
1 parent 570daa1 commit feff99f
Showing 1 changed file with 1 addition and 1 deletion.
python/paddle/nn/functional/flash_attention.py (1 addition, 1 deletion)
@@ -81,7 +81,7 @@ def _math_attention(
 
 
 def _select_sdp_cuda(head_dim):
-    if head_dim < 128:
+    if head_dim <= 128:
         return "flash_attn"
     else:
         return "mem_efficient"
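
For context, _select_sdp_cuda picks the scaled-dot-product attention backend from the head dimension alone: dimensions the flash kernel can handle return "flash_attn", and anything larger falls back to "mem_efficient". The one-character change moves the boundary case head_dim == 128 off the fallback and onto the flash path. Below is a minimal, self-contained sketch of the updated selector with a boundary check; the cutoff of 128 (inclusive) is taken from this commit, and the surrounding Paddle dispatch code is omitted.

def _select_sdp_cuda(head_dim):
    # Per this commit, head dimensions up to and including 128 use the
    # flash attention kernel; larger heads fall back to the
    # memory-efficient implementation.
    if head_dim <= 128:
        return "flash_attn"
    else:
        return "mem_efficient"

# Boundary check: head_dim == 128 previously (with "<") selected
# "mem_efficient"; with "<=" it now selects "flash_attn".
assert _select_sdp_cuda(128) == "flash_attn"
assert _select_sdp_cuda(129) == "mem_efficient"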
