Pull requests: Dao-AILab/flash-attention
Pull requests list

fix noisy logger
#2414 opened Mar 31, 2026 by drisspg
[ROCM] Fix windows issues
#2385 opened Mar 23, 2026 by micmelesse
Fix missing seqlen_info param in softcap scoremod
#2366 opened Mar 17, 2026 by rucnyz
[CuTe, Sm100] PackGQA for backward
#2354 opened Mar 15, 2026 by reubenconducts
Add SM120 kernel-level paged KV cache support
#2348 opened Mar 13, 2026 by blake-snc
[varlen] add autograd function to zero out nan padding
#2324 opened Mar 10, 2026 by liangel-02
Fp8 2 level accumulation
#2323 opened Mar 10, 2026 by PatrykSaffer (Draft)
[Fwd,Sm100] Tune ex2 emulation for softmax
#2316 opened Mar 7, 2026 by zobinHuang