vitoplantamura / OnnxStream

Lightweight inference library for ONNX files, written in C++. It can run SDXL on a Raspberry Pi Zero 2, as well as Mistral 7B on desktops and servers.

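For a sense of how a library like this is consumed, the sketch below outlines a plausible C++ usage flow: load an exported model, bind an input tensor, run inference. The type and method names (Model, Tensor, read_file, push_tensor, run) are assumptions for illustration only, not a verified reproduction of OnnxStream's actual API; consult the repository README for the real interface.

    // Hypothetical usage sketch: every name below is an assumption, not a
    // verified copy of OnnxStream's API. See the repository README for the
    // actual interface and the model-export workflow.
    #include "onnxstream.h"   // assumed header name

    #include <utility>

    int main()
    {
        onnxstream::Model model;               // assumed model wrapper
        model.read_file("path/to/model.txt");  // assumed: load an exported model

        onnxstream::Tensor input;              // assumed tensor type
        input.m_name  = "input";               // bind by tensor name
        input.m_shape = {1, 4, 64, 64};        // example latent-shaped input
        // ... fill input.m_data with the input values ...

        model.push_tensor(std::move(input));   // assumed: attach the input
        model.run();                           // assumed: execute the graph

        // Output tensors would then be read back from the model object.
        return 0;
    }

The project's stated low-memory focus (hence the Raspberry Pi Zero 2 target) comes from streaming model weights from storage during inference rather than loading them all up front, as described in the repository.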
Date Created 2023-07-14 (10 months ago)
Commits 33 (last one 14 hours ago)
Stargazers 1,767 (6 this week)
Watchers 25 (0 this week)
Forks 75
License other
Ranking

RepositoryStats indexes 523,840 repositories; of these, vitoplantamura/OnnxStream is ranked #27,362 (95th percentile) for total stargazers and #86,287 for total watchers. GitHub reports the primary language for this repository as C++; among repositories using this language it is ranked #1,426/28,144.

vitoplantamura/OnnxStream is also tagged with popular topics; for these it is ranked: machine-learning (#806/7,264), raspberry-pi (#82/1,141), stable-diffusion (#70/769), llama (#63/411)

Other Information

vitoplantamura/OnnxStream has GitHub issues enabled; there are 39 open issues and 24 closed issues.

There have been 2 releases; the latest was published on 2024-01-14 (4 months ago) under the name "win-x64: LLM chat app: TinyLlama 1.1B and Mistral 7B (with optional cuBLAS support)".

Star History

GitHub stargazers over time

Watcher History

GitHub watchers over time (data collection started in 2023)

Recent Commit History

33 commits on the default branch (master) since January 2022

Yearly Commits

Commits to the default branch (master) per year

Issue History

Languages

The primary language is C++, but other languages are also present.

Updated: 2024-05-31 @ 08:04 PM, id: 666197652 / R_kgDOJ7VelA