fla-org / flash-linear-attention

Efficient implementations of state-of-the-art linear attention models in PyTorch and Triton
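
For context, "linear attention" replaces the softmax attention matrix with a kernel feature map φ so that attention can be computed as φ(Q)(φ(K)ᵀV) in O(N) time and memory instead of O(N²). The sketch below is a minimal, non-causal, pure-PyTorch illustration of that math (following Katharopoulos et al., 2020); it is hypothetical and is not this repository's API, which provides fused Triton kernels instead.

    import torch
    import torch.nn.functional as F

    def linear_attention(q, k, v, eps=1e-6):
        # Hypothetical helper for illustration only -- not part of flash-linear-attention.
        # q, k, v: (batch, heads, seq_len, head_dim)
        # Feature map phi(x) = elu(x) + 1 keeps attention scores positive
        # (Katharopoulos et al., 2020).
        q, k = F.elu(q) + 1.0, F.elu(k) + 1.0
        # Summarise the whole sequence as a (head_dim, head_dim) state: K^T V.
        kv = torch.einsum('bhnd,bhne->bhde', k, v)
        # Normaliser: phi(Q) dotted with sum_n phi(K_n); always positive.
        z = torch.einsum('bhnd,bhd->bhn', q, k.sum(dim=2))
        # O(N) output, never materialising the N x N attention matrix.
        return torch.einsum('bhnd,bhde->bhne', q, kv) / (z.unsqueeze(-1) + eps)

    # Toy usage: output shape matches the input, cost is linear in seq_len.
    q = k = v = torch.randn(1, 2, 16, 8)
    print(linear_attention(q, k, v).shape)  # torch.Size([1, 2, 16, 8])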

Date Created 2023-12-20 (about a year ago)
Commits 929 (last one a day ago)
Stargazers 1,526 (43 this week)
Watchers 29 (0 this week)
Forks 75
License MIT
Ranking

RepositoryStats indexes 599,932 repositories; of these, fla-org/flash-linear-attention is ranked #34,549 (94th percentile) for total stargazers and #76,111 for total watchers. GitHub reports the primary language for this repository as Python; among repositories using this language it is ranked #5,339/120,502.

fla-org/flash-linear-attention is also tagged with popular topics, for which it is ranked: natural-language-processing (#173/1431) and large-language-models (#120/1102)

Other Information

fla-org/flash-linear-attention has GitHub issues enabled; there are 4 open issues and 66 closed issues.

Homepage URL: https://github.com/fla-org/flash-linear-attention

Star History

GitHub stargazers over time

Watcher History

GitHub watchers over time; collection started in 2023

Recent Commit History

929 commits on the default branch (main) since January 2022

Yearly Commits

Commits to the default branch (main) per year

Issue History

Languages

The primary language is Python, but other languages are present as well.

Updated: 2025-01-02 @ 05:34am · id: 733802106 / R_kgDOK7zueg