lucidrains / flash-cosine-sim-attention

Implementation of fused cosine similarity attention in the same style as Flash Attention
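The repository fuses the cosine-similarity attention computation into a single CUDA kernel. As a rough reference for what is being fused, here is a minimal, unfused NumPy sketch of cosine-similarity attention: queries and keys are l2-normalized so their dot products are bounded cosine similarities, a fixed scale stands in for the usual 1/sqrt(d) factor, then a standard softmax-weighted sum follows. The function name and the `scale=10.0` default are illustrative assumptions, not the repository's API.

```python
import numpy as np

def cosine_sim_attention(q, k, v, scale=10.0, eps=1e-8):
    # l2-normalize queries and keys so q @ k.T gives cosine similarities
    q = q / (np.linalg.norm(q, axis=-1, keepdims=True) + eps)
    k = k / (np.linalg.norm(k, axis=-1, keepdims=True) + eps)
    # similarities lie in [-1, 1], so a fixed scale replaces 1/sqrt(d)
    sim = scale * (q @ k.T)
    # numerically stable softmax over the key axis
    sim = sim - sim.max(axis=-1, keepdims=True)
    attn = np.exp(sim)
    attn = attn / attn.sum(axis=-1, keepdims=True)
    # weighted sum of values
    return attn @ v
```

Because the similarities are bounded, the softmax inputs have a known range, which is part of what makes the fused-kernel formulation attractive; the fused version computes all of the above in one pass without materializing the full attention matrix.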

Date Created 2022-08-04 (2 years ago)
Commits 297 (last one 2 years ago)
Stargazers 211 (0 this week)
Watchers 12 (0 this week)
Forks 12
License MIT
Ranking

RepositoryStats indexes 627,741 repositories; of these, lucidrains/flash-cosine-sim-attention is ranked #180,218 (71st percentile) for total stargazers and #172,752 for total watchers. GitHub reports the primary language for this repository as Cuda; among repositories using this language it is ranked #121/388.

lucidrains/flash-cosine-sim-attention is also tagged with popular topics, for which it is ranked: deep-learning (#3,516/8,776), artificial-intelligence (#882/2,253)

Other Information

lucidrains/flash-cosine-sim-attention has GitHub issues enabled; there are 2 open issues and 8 closed issues.

There have been 66 releases; the latest, 0.1.40, was published on 2022-11-02 (2 years ago).

Star History

GitHub stargazers over time


Watcher History

GitHub watchers over time; collection started in '23


Recent Commit History

297 commits on the default branch (main) since Jan '22


Yearly Commits

Commits to the default branch (main) per year


Issue History

Series shown: Total Issues, Open Issues, Closed Issues

Languages

The primary language is Cuda, but there are others as well:

Cuda, Python, C++, Makefile

updated: 2025-03-14 @ 04:06am, id: 521331515 / R_kgDOHxLjOw