Pull requests: flash-algo/flash-sparse-attention
Merged pull requests:
Add support for targeted GPU architecture builds
#171 by LoserCheems, merged Sep 20, 2025
Add wheel URL generation and caching for pre-built wheels in setup.py
#156 by LoserCheems, merged Sep 8, 2025
Simplify backend imports and initialization
#153 by LoserCheems, merged Sep 7, 2025
Add forward and backward performance tables to README and README_zh
#152 by LoserCheems, merged Sep 6, 2025
add correct arch info for hopper platform
#149 by yiakwy-xpu-ml-framework-team, merged Sep 5, 2025
Fix attention mask handling for invalid topk values
#138 by LoserCheems, merged Aug 29, 2025
Remove paper citation and author information from README files
#102 by LoserCheems, merged Aug 10, 2025
Renames Flash Attention to SDPA in benchmark suite
#67 by LoserCheems, merged Jul 10, 2025
Improves code clarity and test coverage (label: bug)
#42 by LoserCheems, merged Jun 30, 2025