sustcsonglin / flash-linear-attention

Efficient implementations of state-of-the-art linear attention models in PyTorch and Triton
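To illustrate what "linear attention" refers to, here is a minimal NumPy sketch of the causal linear-attention recurrence (state `S_t = S_{t-1} + k_t v_tᵀ`, output `o_t = q_tᵀ S_t` with normalization). This is illustrative only: the feature map (`elu(x) + 1`) and shapes are assumptions, and the repository's actual kernels fuse this recurrence into optimized Triton passes rather than a Python loop.

```python
import numpy as np

def linear_attention(q, k, v):
    """Causal linear attention via the recurrent state formulation.

    q, k, v: arrays of shape (seq_len, d). A positive feature map
    (elu(x) + 1) is assumed here for illustration; real implementations
    compute this recurrence in fused, chunked GPU kernels.
    """
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))  # elu(x) + 1
    q, k = phi(q), phi(k)
    d = q.shape[1]
    state = np.zeros((d, d))   # running sum of outer products k_t v_t^T
    norm = np.zeros(d)         # running sum of k_t, for the denominator
    out = np.empty_like(v)
    for t in range(q.shape[0]):
        state += np.outer(k[t], v[t])
        norm += k[t]
        out[t] = (q[t] @ state) / (q[t] @ norm + 1e-6)
    return out

rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((8, 4)) for _ in range(3))
o = linear_attention(q, k, v)
print(o.shape)  # (8, 4)
```

Because the state has fixed size `d x d`, each step costs O(d²) regardless of sequence length, which is what makes linear attention attractive compared with the O(n²) cost of softmax attention.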

Date created: 2023-12-20 (9 months ago)
Commits: 679 (last one 2 days ago)
Stargazers: 1,229 (18 this week)
Watchers: 25 (0 this week)
Forks: 66
License: MIT
Ranking

RepositoryStats indexes 565,279 repositories; of these, sustcsonglin/flash-linear-attention is ranked #41,320 (93rd percentile) by total stargazers and #87,556 by total watchers. GitHub reports the primary language for this repository as Python; among repositories using this language it is ranked #6,346/111,292.

sustcsonglin/flash-linear-attention is also tagged with popular topics, for which it is ranked: natural-language-processing (#200/1,381), large-language-models (#119/981)

Other Information

sustcsonglin/flash-linear-attention has GitHub issues enabled; there are 2 open issues and 42 closed issues.

Star History

GitHub stargazers over time

Watcher History

GitHub watchers over time; collection started in '23

Recent Commit History

679 commits on the default branch (main) since Jan '22

Yearly Commits

Commits to the default branch (main) per year

Issue History

Languages

The only known language in this repository is Python

Updated: 2024-09-28 @ 03:43pm, id: 733802106 / R_kgDOK7zueg