lucidrains / flash-cosine-sim-attention

Implementation of fused cosine similarity attention in the same style as Flash Attention
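The repository implements cosine-similarity attention, in which queries and keys are l2-normalized before the dot product so attention logits are bounded. As a rough unfused sketch of the underlying math (not the repo's fused CUDA kernel; the fixed `scale` of 10 and all function names here are illustrative assumptions):

```python
import numpy as np

def l2norm(x, eps=1e-12):
    # Normalize along the feature dimension so q @ k.T becomes a cosine similarity.
    return x / (np.linalg.norm(x, axis=-1, keepdims=True) + eps)

def cosine_sim_attention(q, k, v, scale=10.0):
    # Cosine-similarity attention: unlike standard scaled dot-product attention,
    # q and k are unit-normalized, so every logit lies in [-scale, scale].
    q, k = l2norm(q), l2norm(k)
    logits = scale * (q @ k.T)
    # Numerically stable softmax over the key axis.
    logits -= logits.max(axis=-1, keepdims=True)
    weights = np.exp(logits)
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a convex combination of the value rows.
    return weights @ v
```

The bounded logits are what make the single-kernel ("fused") formulation attractive: the softmax can be computed without the running-max bookkeeping that Flash Attention needs for unbounded scores.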

Date Created 2022-08-04 (2 years ago)
Commits 297 (last one about a year ago)
Stargazers 207 (0 this week)
Watchers 12 (0 this week)
Forks 11
License MIT
Ranking

RepositoryStats indexes 585,332 repositories; of these, lucidrains/flash-cosine-sim-attention is ranked #174,588 (70th percentile) for total stargazers and #177,675 for total watchers. GitHub reports the primary language for this repository as Cuda; among repositories using this language it is ranked #107/342.

lucidrains/flash-cosine-sim-attention is also tagged with popular topics; for these it is ranked: deep-learning (#3,452/8,403), artificial-intelligence (#824/2,064)

Other Information

lucidrains/flash-cosine-sim-attention has GitHub issues enabled; there are 2 open issues and 8 closed issues.

There have been 66 releases, the latest one was published on 2022-11-02 (2 years ago) with the name 0.1.40.

Star History

GitHub stargazers over time

Watcher History

GitHub watchers over time; collection started in '23

Recent Commit History

297 commits on the default branch (main) since Jan '22

Yearly Commits

Commits to the default branch (main) per year

Issue History

Languages

The primary language is Cuda, but other languages are also present.

updated: 2024-11-17 @ 02:49am, id: 521331515 / R_kgDOHxLjOw