vitoplantamura / OnnxStream

Lightweight inference library for ONNX files, written in C++. It can run Stable Diffusion XL 1.0 on a Raspberry Pi Zero 2 (in about 298 MB of RAM), as well as Mistral 7B on desktops and servers. ARM, x86, WASM and RISC-V are supported. Accelerated by XNNPACK.

Date Created 2023-07-14 (about a year ago)
Commits 58 (last one about a month ago)
Stargazers 1,855 (0 this week)
Watchers 29 (0 this week)
Forks 84
License other
Ranking

RepositoryStats indexes 584,353 repositories; of these, vitoplantamura/OnnxStream is ranked #27,667 (95th percentile) for total stargazers and #75,562 for total watchers. GitHub reports the primary language for this repository as C++; among repositories using this language it is ranked #1,465/31,270.
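The percentile quoted above follows directly from the two figures given (584,353 indexed repositories, stargazer rank #27,667). A minimal sketch of that arithmetic in Python, with variable names chosen here for illustration:

```python
# Percentile of a rank among all indexed repositories:
# rank 1 is best, so the fraction ranked below this repo is 1 - rank/total.
total_repos = 584_353   # repositories indexed by RepositoryStats
star_rank = 27_667      # OnnxStream's stargazer rank

percentile = (1 - star_rank / total_repos) * 100
print(f"{percentile:.1f}")  # ~95.3, consistent with the "95th percentile" above
```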

vitoplantamura/OnnxStream is also tagged with popular topics; for these it is ranked: machine-learning (#819/7,926), raspberry-pi (#88/1,220), wasm (#95/1,065), webassembly (#88/993), stable-diffusion (#74/872), llama (#77/519), onnx (#48/356)

Other Information

vitoplantamura/OnnxStream has GitHub issues enabled; there are 48 open issues and 27 closed issues.

There have been 2 releases; the latest one was published on 2024-01-14 (10 months ago) with the name "win-x64: LLM chat app: TinyLlama 1.1B and Mistral 7B (with optional cuBLAS support)".

Homepage URL: https://yolo.vitoplantamura.com/

Star History

GitHub stargazers over time

Watcher History

GitHub watchers over time; collection started in 2023

Recent Commit History

58 commits on the default branch (master) since January 2022

Yearly Commits

Commits to the default branch (master) per year

Issue History

Languages

The primary language is C++, but other languages are also present in the repository.

updated: 2024-11-19 @ 08:40am, id: 666197652 / R_kgDOJ7VelA