madroidmaq / mlx-omni-server
MLX Omni Server is a local inference server powered by Apple's MLX framework, specifically designed for Apple Silicon (M-series) chips. It implements OpenAI-compatible API endpoints, enabling seamless integration with existing OpenAI SDK clients while leveraging the power of local ML inference.
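To illustrate what that OpenAI compatibility means in practice, the sketch below points the standard openai Python client at a locally running server. The port, base URL, and model identifier are assumptions for illustration only and are not taken from this page; check the project's README for the actual defaults.

    from openai import OpenAI

    # Point the standard OpenAI SDK at the local server instead of api.openai.com.
    # Both values below are assumptions: the real host/port may differ.
    client = OpenAI(
        base_url="http://localhost:10240/v1",  # assumed local server address
        api_key="not-needed",                  # local inference; any placeholder works
    )

    response = client.chat.completions.create(
        model="mlx-community/Llama-3.2-1B-Instruct-4bit",  # hypothetical MLX model id
        messages=[{"role": "user", "content": "Hello from Apple Silicon!"}],
    )
    print(response.choices[0].message.content)

Because the server exposes OpenAI-compatible endpoints, existing tooling built on the OpenAI SDK should only need the base_url (and a dummy API key) changed to target local inference.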
RepositoryStats indexes 594,458 repositories; of these, madroidmaq/mlx-omni-server is ranked #228,420 (62nd percentile) for total stargazers and #376,884 for total watchers. Github reports the primary language for this repository as Python; among repositories using this language, it is ranked #41,590/118,961.
madroidmaq/mlx-omni-server has Github issues enabled, with 1 open issue and 3 closed issues.
There have been 3 releases; the latest, v0.2.0, was published on 2024-12-16 (a day ago).
Star History: Github stargazers over time
Watcher History: Github watchers over time (collection started in '23)
Recent Commit History: 93 commits on the default branch (main) since Jan '22
Yearly Commits: commits to the default branch (main) per year
Issue History
Languages
The only known language in this repository is Python
updated: 2024-12-17 @ 11:25pm, id: 883682853 / R_kgDONKvuJQ