asprenger / ray_vllm_inference

A simple service that integrates vLLM with Ray Serve for fast and scalable LLM serving.

Date Created 2023-10-28 (11 months ago)
Commits 52 (last one 9 months ago)
Stargazers 49 (3 this week)
Watchers 2 (0 this week)
Forks 4
License apache-2.0
Ranking

RepositoryStats indexes 565,279 repositories; of these, asprenger/ray_vllm_inference is ranked #456,672 (19th percentile) for total stargazers and #467,523 for total watchers. GitHub reports the primary language for this repository as Python; among repositories using that language it is ranked #86,208/111,292.

asprenger/ray_vllm_inference is also tagged with popular topics, for which it is ranked: pytorch (#4,881/5,822), llm (#1,857/2,462), transformer (#799/976), mlops (#325/404), inference (#239/287), llmops (#122/150)

Other Information

asprenger/ray_vllm_inference has GitHub issues enabled; there are 4 open issues and 0 closed issues.

Star History

GitHub stargazers over time

Watcher History

GitHub watchers over time; collection started in '23

Recent Commit History

52 commits on the default branch (master) since Jan '22

Yearly Commits

Commits to the default branch (master) per year

Issue History

Languages

The primary language is Python, but other languages are present as well.

updated: 2024-09-28 @ 01:22pm, id: 711332577 / R_kgDOKmYS4Q