wangcx18 / llm-vscode-inference-server

An endpoint server for efficiently serving quantized open-source LLMs for code.
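Servers of this kind typically expose an HTTP text-generation route that the llm-vscode extension can point at. As a rough sketch only, the client below assumes a Hugging-Face-style JSON schema (`inputs` / `parameters` / `generated_text`) and a hypothetical local URL; the actual route and field names depend on the server's configuration.

```python
import json
import urllib.request

# Hypothetical endpoint; adjust host, port, and path to the running server.
ENDPOINT = "http://localhost:8000/generate"

def build_payload(prompt: str, max_new_tokens: int = 64) -> dict:
    """Build an HF text-generation-style request body (assumed schema)."""
    return {
        "inputs": prompt,
        "parameters": {"max_new_tokens": max_new_tokens, "temperature": 0.2},
    }

def complete(prompt: str) -> str:
    """POST the prompt to the inference server and return the generated text."""
    req = urllib.request.Request(
        ENDPOINT,
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Assumed response shape: {"generated_text": "..."}
        return json.load(resp)["generated_text"]
```

A code-completion request would then be something like `complete("def fibonacci(n):")`, with the extension handling this round trip automatically once its endpoint setting points at the server.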

Date Created 2023-09-25 (about a year ago)
Commits 3 (last one 11 months ago)
Stargazers 52 (0 this week)
Watchers 1 (0 this week)
Forks 8
License apache-2.0
Ranking

RepositoryStats indexes 565,279 repositories; among these, wangcx18/llm-vscode-inference-server is ranked #439,242 (22nd percentile) by total stargazers and #521,451 by total watchers. Github reports the primary language for this repository as Python; among repositories using this language it is ranked #82,853/111,292.

wangcx18/llm-vscode-inference-server is also tagged with popular topics; within these it is ranked: llm (#1,814/2,462), vscode-extension (#610/768), llm-inference (#108/141)

Other Information

wangcx18/llm-vscode-inference-server has Github issues enabled; there are 4 open issues and 1 closed issue.

Star History

Github stargazers over time

Watcher History

Github watchers over time, collection started in '23

Recent Commit History

3 commits on the default branch (main) since Jan '22

Yearly Commits

Commits to the default branch (main) per year

Issue History

Languages

The only known language in this repository is Python

updated: 2024-09-15 @ 03:01am, id: 696044786 / R_kgDOKXzM8g