triton-inference-server / dali_backend

The Triton backend that allows running GPU-accelerated data pre-processing pipelines implemented in DALI's Python API.
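To give a sense of how such a backend is typically wired up, here is a minimal sketch of a Triton model configuration for a DALI pipeline model. This is an illustrative assumption, not taken from this repository: the model name `dali_preprocess`, the tensor names, and the dimensions are all hypothetical, and the serialized pipeline file is assumed to sit in the model's version directory.

```protobuf
# Hypothetical config.pbtxt for a DALI model in a Triton model repository.
# Assumes a DALI pipeline serialized to a file in the version directory.
name: "dali_preprocess"
backend: "dali"
max_batch_size: 256
input [
  {
    name: "DALI_INPUT_0"
    data_type: TYPE_UINT8
    dims: [ -1 ]            # encoded image bytes, variable length
  }
]
output [
  {
    name: "DALI_OUTPUT_0"
    data_type: TYPE_FP32
    dims: [ 3, 224, 224 ]   # e.g. a normalized CHW image
  }
]
```

The pipeline itself would be authored in DALI's Python API (e.g. with the `@pipeline_def` decorator) and serialized with `Pipeline.serialize()` so the backend can load it at model-load time.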

Date Created 2020-09-09 (4 years ago)
Commits 158 (last one about a month ago)
Stargazers 125 (1 this week)
Watchers 9 (0 this week)
Forks 29
License MIT
Ranking

RepositoryStats indexes 584,777 repositories; of these, triton-inference-server/dali_backend is ranked #249,556 (57th percentile) for total stargazers and #224,086 for total watchers. GitHub reports the primary language for this repository as C++; among repositories using this language it is ranked #14,056/31,292.

triton-inference-server/dali_backend is also tagged with popular topics, for which it is ranked: python (#11,223/22,023), deep-learning (#4,501/8,398), image-processing (#555/1,093), gpu (#537/900).

Other Information

triton-inference-server/dali_backend has 5 open pull requests on GitHub; 155 pull requests have been merged over the lifetime of the repository.

GitHub issues are enabled; there are 22 open issues and 52 closed issues.

Homepage URL: https://docs.nvidia.com/deeplearning/dali/user-guide/docs/index.html

Star History

GitHub stargazers over time

Watcher History

GitHub watchers over time; collection started in 2023

Recent Commit History

91 commits on the default branch (main) since January 2022

Yearly Commits

Commits to the default branch (main) per year

Issue History

Languages

The primary language is C++, but other languages are also used.


updated: 2024-11-20 @ 04:33am, id: 294219252 / R_kgDOEYlt9A