asprenger / ray_vllm_inference

A simple service that integrates vLLM with Ray Serve for fast and scalable LLM serving.
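
For context, the core pattern such a service builds on is to wrap a vLLM engine inside a Ray Serve deployment, so each replica loads the model once and Ray handles HTTP routing and scaling. The following is a minimal sketch of that pattern, not this repository's actual code; the model name, request schema, and use of vLLM's blocking LLM class (rather than its async engine) are illustrative assumptions made to keep the example short:

    # Minimal sketch: a vLLM engine wrapped in a Ray Serve deployment.
    # Model name and JSON request schema are illustrative assumptions.
    from ray import serve
    from vllm import LLM, SamplingParams

    @serve.deployment
    class VLLMDeployment:
        def __init__(self, model_name: str = "facebook/opt-125m"):
            # Each Ray Serve replica loads the model once; vLLM batches
            # concurrent requests on the GPU internally.
            self.llm = LLM(model=model_name)

        async def __call__(self, request):
            # Expects a JSON body like {"prompt": "...", "max_tokens": 64}.
            body = await request.json()
            params = SamplingParams(max_tokens=body.get("max_tokens", 64))
            outputs = self.llm.generate([body["prompt"]], params)
            return {"text": outputs[0].outputs[0].text}

    app = VLLMDeployment.bind()
    # serve.run(app) starts the service; POST JSON to http://localhost:8000/

A production service would typically use vLLM's asynchronous engine instead of the blocking LLM class, so that request handling does not stall Serve's event loop.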

Date Created 2023-10-28 (about a year ago)
Commits 52 (last one 11 months ago)
Stargazers 54 (0 this week)
Watchers 2 (0 this week)
Forks 4
License Apache-2.0
Ranking

RepositoryStats indexes 584,353 repositories; of these, asprenger/ray_vllm_inference is ranked #439,672 (25th percentile) for total stargazers and #478,709 for total watchers. GitHub reports the primary language for this repository as Python; among repositories using this language it is ranked #83,651/116,326.

asprenger/ray_vllm_inference is also tagged with popular topics; for these it is ranked: pytorch (#4,772/5,948), llm (#1,962/2,726), transformer (#790/1,000), mlops (#327/413), inference (#232/301), llmops (#131/164)

Other Information

asprenger/ray_vllm_inference has GitHub issues enabled; there are 4 open issues and 0 closed issues.

Star History

GitHub stargazers over time

Watcher History

GitHub watchers over time; collection started in '23

Recent Commit History

52 commits on the default branch (master) since Jan '22

Yearly Commits

Commits to the default branch (master) per year

Issue History

Languages

The primary language is Python, but there are others too...

updated: 2024-11-19 @ 08:07am, id: 711332577 / R_kgDOKmYS4Q