triton-inference-server / dali_backend

A Triton Inference Server backend for running GPU-accelerated data pre-processing pipelines implemented in DALI's Python API.
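A DALI model is served like any other Triton model: a model directory containing a serialized DALI pipeline plus a `config.pbtxt` that selects the DALI backend. The sketch below is illustrative, not taken from this repository; the model name, tensor names, and shapes are assumptions (a typical image-decoding pipeline), while the `backend: "dali"` setting is what routes the model to this backend.

```protobuf
name: "dali_preprocess"      # hypothetical model name
backend: "dali"              # routes this model to the DALI backend
max_batch_size: 256
input [
  {
    name: "DALI_INPUT_0"     # assumed name; must match the pipeline's external input
    data_type: TYPE_UINT8
    dims: [ -1 ]             # variable-length encoded image bytes
  }
]
output [
  {
    name: "DALI_OUTPUT_0"    # assumed name of the pipeline's output
    data_type: TYPE_FP32
    dims: [ 3, 224, 224 ]    # example: decoded, resized CHW image
  }
]
```

With a layout like this, decoding and resizing run on the GPU inside Triton, so clients can send raw encoded images instead of pre-processed tensors.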

Date Created 2020-09-09 (4 years ago)
Commits 159 (last one 19 days ago)
Stargazers 129 (0 this week)
Watchers 9 (0 this week)
Forks 29
License MIT
Ranking

RepositoryStats indexes 596,972 repositories; of these, triton-inference-server/dali_backend is ranked #247,612 (59th percentile) by total stargazers and #225,975 by total watchers. GitHub reports the primary language for this repository as C++; among repositories using this language it is ranked #13,951/31,900.

triton-inference-server/dali_backend is also tagged with popular topics, for each of which it is ranked: python (#11,168/22,355), deep-learning (#4,478/8,529), image-processing (#547/1,121), gpu (#539/918)

Other Information

triton-inference-server/dali_backend has 6 open pull requests on GitHub; 156 pull requests have been merged over the lifetime of the repository.

GitHub issues are enabled; there are 22 open issues and 52 closed issues.

Homepage URL: https://docs.nvidia.com/deeplearning/dali/user-guide/docs/index.html

Star History

GitHub stargazers over time

Watcher History

GitHub watchers over time; collection started in 2023

Recent Commit History

92 commits on the default branch (main) since January 2022

Yearly Commits

Commits to the default branch (main) per year

Issue History

Languages

The primary language is C++, but several other languages are also present.


updated: 2024-12-24 @ 03:36pm, id: 294219252 / R_kgDOEYlt9A