66RING / tiny-flash-attention

Flash attention tutorial written in Python, Triton, CUDA, and CUTLASS
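The tutorial's subject is the tiled, online-softmax formulation of attention, which avoids materializing the full N x N score matrix. As a rough illustration of that idea (not code from the repository itself), here is a minimal NumPy sketch; the function names and block size are illustrative:

```python
import numpy as np

def naive_attention(Q, K, V):
    # Standard attention: softmax(Q K^T / sqrt(d)) V, materializing all scores.
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

def flash_attention_sketch(Q, K, V, block_size=2):
    # Tiled attention with an online softmax: K and V are streamed in blocks,
    # keeping only per-row running max and running denominator.
    N, d = Q.shape
    scale = 1.0 / np.sqrt(d)
    out = np.zeros_like(Q)
    row_max = np.full((N, 1), -np.inf)   # running max of scores per query row
    row_sum = np.zeros((N, 1))           # running softmax denominator
    for start in range(0, N, block_size):
        Kb = K[start:start + block_size]
        Vb = V[start:start + block_size]
        scores = (Q @ Kb.T) * scale      # N x B block of scores
        new_max = np.maximum(row_max, scores.max(axis=-1, keepdims=True))
        correction = np.exp(row_max - new_max)   # rescale old accumulators
        p = np.exp(scores - new_max)
        row_sum = row_sum * correction + p.sum(axis=-1, keepdims=True)
        out = out * correction + p @ Vb
        row_max = new_max
    return out / row_sum

rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((6, 4)) for _ in range(3))
```

Both functions compute the same result; the blocked version is the form that maps onto the Triton and CUDA kernels a tutorial like this one walks through.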

Date Created 2023-12-07 (11 months ago)
Commits 27 (last one 4 months ago)
Stargazers 193 (0 this week)
Watchers 3 (0 this week)
Forks 14
License unknown
Ranking

RepositoryStats indexes 579,555 repositories; of these, 66RING/tiny-flash-attention is ranked #182,310 (69th percentile) for total stargazers and #420,420 for total watchers. GitHub reports the primary language for this repository as Cuda; among repositories using this language, it is ranked #109/335.

Other Information

66RING/tiny-flash-attention has GitHub issues enabled; there are 7 open issues and 2 closed issues.

Star History

GitHub stargazers over time

Watcher History

GitHub watchers over time; collection started in 2023

Recent Commit History

27 commits on the default branch (main) since Jan 2022

Yearly Commits

Commits to the default branch (main) per year

Issue History

Languages

The primary language is Cuda, but other languages are present as well.

updated: 2024-11-05 @ 07:03pm, id: 728461785 / R_kgDOK2tx2Q