Tags: flash-algo/flash-sparse-attention

v1.2.3: Fix documentation and references for Flash Sparse Attention
v1.2.2: Bump package version to 1.2.2
v1.2.1
v1.2.0
v1.1.9
v1.1.8: Merge pull request #176 from SmallDoges:auto-workflow (Bump version to 1.1.8)
v1.1.7: Merge pull request #175 from SmallDoges:auto-workflow (Increase GitHub Actions build timeout to 6 hours)
v1.1.6: Merge pull request #174 from SmallDoges:auto-workflow (Remove CUDA architecture '120' for compatibility)
v1.1.5: Merge pull request #173 from SmallDoges:auto-workflow (Expand build matrix for ARM64 and additional CUDA architectures)
v1.1.4: Merge pull request #172 from SmallDoges/auto-workflow (Refine build matrix and CUDA architecture specifications)
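
Several of the releases above (v1.1.4 through v1.1.6) adjust the CUDA architecture list used when building the extension. As a point of reference only, the sketch below shows how such an architecture list is commonly declared for a PyTorch CUDA extension; the package name, source path, and architecture values are illustrative assumptions and are not taken from this project's actual setup.py.

    # Minimal sketch (not the project's actual build script) of declaring a
    # CUDA architecture list for a PyTorch CUDA extension.
    from setuptools import setup
    from torch.utils.cpp_extension import BuildExtension, CUDAExtension

    # Hypothetical architecture list; an entry such as sm_120 might be dropped
    # when the toolchain or build runners do not support it (cf. v1.1.6).
    cuda_arch_flags = []
    for arch in ("80", "86", "89", "90"):
        cuda_arch_flags += ["-gencode", f"arch=compute_{arch},code=sm_{arch}"]

    setup(
        name="flash_sparse_attn_example",  # illustrative package name
        ext_modules=[
            CUDAExtension(
                name="flash_sparse_attn_example_cuda",
                sources=["csrc/flash_sparse_attn.cu"],  # assumed source path
                extra_compile_args={
                    "cxx": ["-O3"],
                    "nvcc": ["-O3"] + cuda_arch_flags,
                },
            )
        ],
        cmdclass={"build_ext": BuildExtension},
    )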