tspeterkim / flash-attention-minimal

Flash Attention in ~100 lines of CUDA (forward pass only)
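
The tagline refers to the FlashAttention technique: computing exact attention tile-by-tile with an online softmax, so the full N x N score matrix is never materialized in global memory. The repository's own ~100-line kernel is the authoritative version; what follows is only an independent, single-head sketch of the idea, in which the tile size Bc, block width Br, head dimension d, and the row-major [N, d] layout are all assumptions made for illustration, not the repo's actual interface.

```cuda
// Minimal single-head sketch of a FlashAttention-style forward pass.
// K and V are streamed through shared memory in tiles of Bc rows, and the
// softmax is computed online (running max m, running denominator l), so the
// N x N score matrix never exists in memory. Bc, Br, d are assumed values.
#include <cuda_runtime.h>
#include <cfloat>
#include <cmath>
#include <cstdio>

constexpr int d  = 64;  // head dimension (assumed)
constexpr int Bc = 32;  // K/V tile size in rows (assumed)

__global__ void flash_forward(const float* Q, const float* K, const float* V,
                              float* O, int N, float scale) {
    __shared__ float Kj[Bc][d];
    __shared__ float Vj[Bc][d];

    int q = blockIdx.x * blockDim.x + threadIdx.x;  // this thread's query row

    float m = -FLT_MAX;  // running max of scores for row q
    float l = 0.f;       // running softmax denominator
    float acc[d] = {};   // running (unnormalized) output row

    for (int j0 = 0; j0 < N; j0 += Bc) {
        int tile = min(Bc, N - j0);
        // All threads in the block cooperatively stage this K/V tile.
        for (int idx = threadIdx.x; idx < tile * d; idx += blockDim.x) {
            Kj[idx / d][idx % d] = K[(j0 + idx / d) * d + idx % d];
            Vj[idx / d][idx % d] = V[(j0 + idx / d) * d + idx % d];
        }
        __syncthreads();

        if (q < N) {
            for (int j = 0; j < tile; ++j) {
                float s = 0.f;  // scaled dot product q . k_j
                for (int x = 0; x < d; ++x) s += Q[q * d + x] * Kj[j][x];
                s *= scale;
                // Online softmax: rescale previous state when the max grows.
                float m_new = fmaxf(m, s);
                float corr  = expf(m - m_new);
                float p     = expf(s - m_new);
                l = l * corr + p;
                for (int x = 0; x < d; ++x)
                    acc[x] = acc[x] * corr + p * Vj[j][x];
                m = m_new;
            }
        }
        __syncthreads();
    }
    if (q < N)
        for (int x = 0; x < d; ++x) O[q * d + x] = acc[x] / l;
}

int main() {
    const int N = 128, Br = 32;  // sequence length, query rows per block
    size_t bytes = (size_t)N * d * sizeof(float);
    float *Q, *K, *V, *O;
    cudaMallocManaged(&Q, bytes);
    cudaMallocManaged(&K, bytes);
    cudaMallocManaged(&V, bytes);
    cudaMallocManaged(&O, bytes);
    for (int i = 0; i < N * d; ++i) {  // arbitrary deterministic inputs
        Q[i] = 0.01f * (i % 7); K[i] = 0.01f * (i % 5); V[i] = 0.01f * (i % 3);
    }
    flash_forward<<<(N + Br - 1) / Br, Br>>>(Q, K, V, O, N,
                                             1.0f / std::sqrt((float)d));
    cudaDeviceSynchronize();
    printf("O[0][0] = %f\n", O[0]);
    return 0;
}
```

A production kernel would additionally tile Q through shared memory, handle causal masking and multiple batched heads, and implement the backward pass; per the tagline above, this repository restricts itself to the forward pass.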

Date Created: 2024-03-07 (8 months ago)
Commits: 8 (last one 8 months ago)
Stargazers: 602 (0 this week)
Watchers: 4 (0 this week)
Forks: 54
License: Apache-2.0
Ranking

RepositoryStats indexes 579,555 repositories; of these, tspeterkim/flash-attention-minimal is ranked #78,218 (87th percentile) for total stargazers and #371,737 for total watchers. GitHub reports the primary language for this repository as Cuda; among repositories using this language, it is ranked #39/335.

Other Information

tspeterkim/flash-attention-minimal has 3 open pull requests on GitHub; 0 pull requests have been merged over the lifetime of the repository.

GitHub issues are enabled; there are 4 open issues and 2 closed issues.

Star History

[chart: GitHub stargazers over time]

Watcher History

[chart: GitHub watchers over time; collection started in 2023]

Recent Commit History

[chart: 8 commits on the default branch (main) since Jan '22]

Yearly Commits

[chart: commits to the default branch (main) per year]

Issue History

[chart: issues over time]

Languages

The primary language is Cuda, but other languages are also present.

updated: 2024-11-05 @ 09:35pm, id: 768455010 / R_kgDOLc2xYg