flashinfer-ai / flashinfer

FlashInfer: Kernel Library for LLM Serving

Date Created 2023-07-22 (about a year ago)
Commits 879 (last one 5 hours ago)
Stargazers 1,730 (76 this week)
Watchers 21 (0 this week)
Forks 170
License apache-2.0
Ranking

RepositoryStats indexes 601,131 repositories; of these, flashinfer-ai/flashinfer is ranked #30,696 (95th percentile) for total stargazers and #106,159 for total watchers. GitHub reports the primary language for this repository as Cuda; among repositories using that language it is ranked #15/361.

flashinfer-ai/flashinfer is also tagged with popular topics; for these it is ranked: pytorch (#540/6041), gpu (#100/920), cuda (#59/659), llm-inference (#22/178)

Other Information

flashinfer-ai/flashinfer has 6 open pull requests on GitHub; 524 pull requests have been merged over the lifetime of the repository.

GitHub issues are enabled; there are 49 open issues and 112 closed issues.

There have been 18 releases, the latest one was published on 2024-12-23 (14 days ago) with the name v0.2.0.post1.

Homepage URL: https://flashinfer.ai

Star History

GitHub stargazers over time

Watcher History

GitHub watchers over time; collection started in '23

Recent Commit History

879 commits on the default branch (main) since Jan '22

Yearly Commits

Commits to the default branch (main) per year

Issue History

Languages

The primary language is Cuda, but other languages are also used.

updated: 2025-01-06 @ 11:19pm, id: 669331976 / R_kgDOJ-UyCA