asprenger / ray_vllm_inference

A simple service that integrates vLLM with Ray Serve for fast and scalable LLM serving.
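For context, here is a minimal sketch of the kind of integration the project describes: a vLLM engine wrapped in a Ray Serve deployment. This is an illustrative assumption based on current vLLM and Ray Serve APIs, not the repository's actual code; the class name, route, and model id are placeholders.

from fastapi import FastAPI
from ray import serve
from vllm import LLM, SamplingParams

app = FastAPI()

@serve.deployment(ray_actor_options={"num_gpus": 1})
@serve.ingress(app)
class VLLMDeployment:
    def __init__(self):
        # Each Serve replica loads the model once; the model id is a placeholder.
        self.llm = LLM(model="facebook/opt-125m")

    @app.post("/generate")
    def generate(self, prompt: str, max_tokens: int = 128) -> str:
        # Run generation for a single prompt and return the first completion.
        outputs = self.llm.generate([prompt], SamplingParams(max_tokens=max_tokens))
        return outputs[0].outputs[0].text

# Deploy on a Ray cluster with, e.g., `serve.run(VLLMDeployment.bind())`.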

Date Created 2023-10-28 (about a year ago)
Commits 52 (last one 10 months ago)
Stargazers 52 (0 this week)
Watchers 2 (0 this week)
Forks 4
License apache-2.0
Ranking

RepositoryStats indexes 579,238 repositories; of these, asprenger/ray_vllm_inference is ranked #447,482 (23rd percentile) for total stargazers and #475,806 for total watchers. GitHub reports the primary language for this repository as Python; among repositories using this language, it is ranked #84,969/115,001.

asprenger/ray_vllm_inference is also tagged with popular topics; for these it is ranked: pytorch (#4,813/5,908), llm (#1,942/2,654), transformer (#798/992), mlops (#330/410), inference (#237/298), llmops (#131/159)

Other Information

asprenger/ray_vllm_inference has GitHub issues enabled; there are 4 open issues and 0 closed issues.

Star History

GitHub stargazers over time

Watcher History

GitHub watchers over time; data collection started in 2023

Recent Commit History

52 commits on the default branch (master) since January 2022

Yearly Commits

Commits to the default branch (master) per year

Issue History

Languages

The primary language is Python, but there are others as well.

updated: 2024-10-19 @ 04:40am, id: 711332577 / R_kgDOKmYS4Q