
Bump FA2 to 2.7.4.post1 #1728


Merged 26 commits into main from bump-fa2-2.7.4.post1 on Mar 12, 2025

Conversation

@KuuCi (Contributor) commented on Feb 24, 2025

FA2 2.6.3 doesn't ship prebuilt wheels for torch > 2.4.

FA changed the unpadding helper (`unpad_input` in `bert_padding.py`) to return 5 outputs instead of 4:
https://github.com/Dao-AILab/flash-attention/blob/08f4c802c450708a86a92b226cba5663be81aead/flash_attn/bert_padding.py#L98
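
For context, a minimal caller-side sketch of handling both signatures (hypothetical code, not from this repo; the toy tensor shapes are assumptions):

```python
import torch
from flash_attn.bert_padding import unpad_input

# Toy inputs: batch of 2, padded seq len 4, hidden dim 8 (assumed shapes).
hidden_states = torch.randn(2, 4, 8)
attention_mask = torch.tensor([[1, 1, 1, 0], [1, 1, 0, 0]])

outputs = unpad_input(hidden_states, attention_mask)
if len(outputs) == 5:
    # flash-attn >= 2.7.0 additionally returns the used sequence lengths
    unpadded, indices, cu_seqlens, max_seqlen, used_seqlens = outputs
else:
    # flash-attn <= 2.6.x returned four values
    unpadded, indices, cu_seqlens, max_seqlen = outputs
```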

MPT doesn't support this FA change, so we now fail gracefully for MPT.
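
For illustration, the graceful failure could be a version guard along these lines (hypothetical sketch, not the exact check added in this PR):

```python
# Hypothetical guard: refuse to run MPT against flash-attn >= 2.7,
# where the new unpadding signature is unsupported.
from packaging import version

import flash_attn

if version.parse(flash_attn.__version__) >= version.parse('2.7.0'):
    raise RuntimeError(
        'MPT does not support flash-attn >= 2.7.0; pin flash-attn < 2.7 '
        'or switch to a different attention implementation.'
    )
```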

Testing:
Regression test: https://databricks.slack.com/archives/C05T1A4UMT8/p1741210602072179

@KuuCi force-pushed the bump-fa2-2.7.4.post1 branch from 05540ed to 0452eaf on February 27, 2025
@KuuCi force-pushed the bump-fa2-2.7.4.post1 branch from 7e4d317 to 2adebe4 on February 28, 2025
@KuuCi requested review from milocress and dakinggg on March 1, 2025
@KuuCi marked this pull request as ready for review on March 1, 2025
@KuuCi requested review from a team as code owners on March 1, 2025
@KuuCi requested a review from dakinggg on March 12, 2025
@dakinggg (Collaborator) left a comment:

LGTM

@KuuCi merged commit f3c6ec2 into main on Mar 12, 2025
9 of 10 checks passed