Pull requests: flash-algo/flash-sparse-attention
Add namespace_config to csrc
#2 by LoserCheems, merged May 14, 2025
Update global to shared memory operation
#7 by LoserCheems, merged May 19, 2025
Improves code clarity and test coverage (label: bug)
#42 by LoserCheems, merged Jun 30, 2025
Enables test case for 512x512 input dimensions
#50 by LoserCheems, merged Jul 1, 2025
Improves code formatting consistency in comments
#52 by LoserCheems, merged Jul 1, 2025
Fixes mask validation in forward kernel
#57 by LoserCheems, merged Jul 3, 2025
Renames Flash Attention to SDPA in benchmark suite
#67 by LoserCheems, merged Jul 10, 2025
Remove paper citation and author information from README files
#102 by LoserCheems, merged Aug 10, 2025
Fixes attention mask/bias shape documentation
#123 by LoserCheems, merged Aug 24, 2025
Fix attention mask handling for invalid topk values
#138 by LoserCheems, merged Aug 29, 2025 (see the sketch below)
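The fix named in #138 corresponds to a common guard: clamping an out-of-range top-k value before building the attention mask. A minimal PyTorch sketch of that general pattern follows; the function name and tensor shapes are illustrative assumptions, not the repository's actual API.

```python
import torch

def topk_attention_mask(scores: torch.Tensor, k: int) -> torch.Tensor:
    """Keep the k highest-scoring keys per query row, clamping invalid k.

    `scores` has shape (..., seq_len). A k <= 0 or k > seq_len would make
    topk() fail, so clamp it into [1, seq_len] first.
    """
    seq_len = scores.size(-1)
    k = max(1, min(k, seq_len))
    idx = scores.topk(k, dim=-1).indices
    mask = torch.zeros_like(scores, dtype=torch.bool)
    mask.scatter_(-1, idx, True)  # True marks the positions that are kept
    return mask
```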
Add correct arch info for Hopper platform
#149 by yiakwy-xpu-ml-framework-team, merged Sep 5, 2025 (see the build sketch after #171 below)
Add wheel URL generation and caching for pre-built wheels in setup.py
#156 by LoserCheems, merged Sep 8, 2025 (see the sketch below)
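#156 describes the pre-built-wheel pattern popularized by flash-attn: compose a wheel URL from the local Python/CUDA/torch versions, download it once, and cache it, falling back to a source build on a miss. A minimal sketch under assumed names follows; the release URL scheme, wheel tag format, and cache directory are placeholders, not the project's real values.

```python
import os
import platform
import sys
import urllib.request

import torch

# Assumed release location; the project's real URL scheme may differ.
BASE_URL = "https://github.com/flash-algo/flash-sparse-attention/releases/download"

def wheel_url(version: str) -> str | None:
    """Compose a wheel filename from the local Python/CUDA/torch toolchain."""
    if torch.version.cuda is None:  # CPU-only torch: no matching wheel exists
        return None
    py = f"cp{sys.version_info.major}{sys.version_info.minor}"
    cuda = "cu" + torch.version.cuda.replace(".", "")                  # e.g. cu121
    torch_tag = "torch" + ".".join(torch.__version__.split(".")[:2])   # e.g. torch2.4
    plat = "linux_x86_64" if platform.system() == "Linux" else "win_amd64"
    name = f"flash_sparse_attention-{version}+{cuda}{torch_tag}-{py}-{py}-{plat}.whl"
    return f"{BASE_URL}/v{version}/{name}"

def fetch_cached_wheel(version: str) -> str | None:
    """Download the matching wheel once; reuse the cached copy afterwards."""
    url = wheel_url(version)
    if url is None:
        return None
    cache_dir = os.path.expanduser("~/.cache/fsa-wheels")  # hypothetical cache dir
    os.makedirs(cache_dir, exist_ok=True)
    dest = os.path.join(cache_dir, os.path.basename(url))
    if not os.path.exists(dest):
        try:
            urllib.request.urlretrieve(url, dest)
        except OSError:
            return None  # no wheel published for this combination: build from source
    return dest
```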
Add support for targeted GPU architecture builds
#171 by LoserCheems, merged Sep 20, 2025 (see the sketch below)
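#171 (together with the Hopper fix in #149) concerns which GPU architectures the CUDA extension is compiled for. Below is a minimal setup.py sketch of the standard PyTorch-extension approach, honoring the TORCH_CUDA_ARCH_LIST environment variable; the module name and source paths are placeholders, not the repository's real layout.

```python
import os
from setuptools import setup
from torch.utils.cpp_extension import BuildExtension, CUDAExtension

# TORCH_CUDA_ARCH_LIST is PyTorch's standard knob, e.g. "8.0;9.0" for
# Ampere + Hopper; default to both when the user does not narrow it.
# (Suffixes like "+PTX" are omitted here for brevity.)
archs = os.environ.get("TORCH_CUDA_ARCH_LIST", "8.0;9.0").split(";")
gencode = [
    f"-gencode=arch=compute_{a.strip().replace('.', '')},"
    f"code=sm_{a.strip().replace('.', '')}"
    for a in archs
]

setup(
    name="flash-sparse-attention",
    ext_modules=[
        CUDAExtension(
            name="flash_sparse_attention_cuda",  # placeholder module name
            sources=["csrc/flash_api.cpp"],      # placeholder source list
            extra_compile_args={"cxx": ["-O3"], "nvcc": ["-O3", *gencode]},
        )
    ],
    cmdclass={"build_ext": BuildExtension},
)
```

Hopper corresponds to compute capability 9.0 (sm_90), so setting TORCH_CUDA_ARCH_LIST="9.0" narrows the build to that platform.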