sustcsonglin / flash-linear-attention

Efficient implementations of state-of-the-art linear attention models in PyTorch and Triton
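For context on what the repository implements: causal linear attention replaces softmax attention's quadratic score matrix with a feature-map recurrence over a running state, giving O(T) time per sequence. The sketch below is a hypothetical, minimal NumPy illustration of that idea (using the common elu(x)+1 feature map), not the library's fused Triton kernels or API.

```python
import numpy as np

def phi(x):
    """Positive feature map elu(x) + 1, a common choice for linear attention."""
    return np.where(x > 0, x + 1.0, np.exp(x))

def causal_linear_attention(q, k, v, eps=1e-6):
    """Causal linear attention via a state recurrence (illustrative sketch).

    q, k: (T, d) queries/keys; v: (T, dv) values.
    Maintains S = sum_t phi(k_t) v_t^T and z = sum_t phi(k_t),
    so each step costs O(d * dv) instead of attending over all past tokens.
    """
    fq, fk = phi(q), phi(k)
    T, d = fq.shape
    dv = v.shape[1]
    S = np.zeros((d, dv))        # running key-value outer-product state
    z = np.zeros(d)              # running key sum (normalizer)
    out = np.zeros((T, dv))
    for t in range(T):
        S += np.outer(fk[t], v[t])
        z += fk[t]
        out[t] = (fq[t] @ S) / (fq[t] @ z + eps)
    return out
```

The recurrence matches a masked quadratic formulation, `tril(phi(Q) phi(K)^T) V` with row normalization, which is one way to sanity-check an implementation like this.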

Date Created 2023-12-20 (11 months ago)
Commits 832 (last one 4 hours ago)
Stargazers 1,389 (37 this week)
Watchers 27 (0 this week)
Forks 71
License MIT
Ranking

RepositoryStats indexes 589,134 repositories; of these, sustcsonglin/flash-linear-attention is ranked #37,643 (94th percentile) for total stargazers and #81,510 for total watchers. GitHub reports the primary language for this repository as Python; among repositories using this language it is ranked #5,799/117,584.

sustcsonglin/flash-linear-attention is also tagged with popular topics, for which it is ranked: natural-language-processing (#184/1,418), large-language-models (#121/1,061)

Other Information

sustcsonglin/flash-linear-attention has GitHub issues enabled; there are 4 open issues and 53 closed issues.

Star History

GitHub stargazers over time

Watcher History

GitHub watchers over time, collection started in '23

Recent Commit History

832 commits on the default branch (main) since Jan '22

Yearly Commits

Commits to the default branch (main) per year

Issue History

Languages

The primary language is Python, but other languages are also present.

updated: 2024-12-03 @ 02:00pm, id: 733802106 / R_kgDOK7zueg