yoshitomo-matsubara / torchdistill
A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc., are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
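Among the distillation methods implemented, the best-known baseline is the soft-target loss of Hinton et al. (2015). The sketch below is a minimal plain-NumPy illustration of that temperature-scaled KL term, not torchdistill's actual API (which is driven by YAML configurations rather than hand-written loss code).

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; a higher T yields a softer distribution.
    z = np.asarray(logits, dtype=float) / T
    e = np.exp(z - z.max())  # subtract max for numerical stability
    return e / e.sum()

def kd_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) over temperature-softened distributions,
    # scaled by T^2 as in Hinton et al. (2015). Hypothetical helper,
    # for illustration only.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))
```

When the student's logits match the teacher's, the loss is zero; any mismatch yields a positive penalty, which is what drives the student toward the teacher's softened output distribution.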
RepositoryStats indexes 523,840 repositories; among these, yoshitomo-matsubara/torchdistill is ranked #37,754 (93rd percentile) by total stargazers and #119,320 by total watchers. GitHub reports Python as this repository's primary language; among repositories using that language, it is ranked #5,660/100,813.
There have been 24 releases; the latest, "New KD methods, updated YAML constructors, and low-level loss support", was published on 2024-03-27 (2 months ago).
Homepage URL: https://yoshitomo-matsubara.net/torchdistill/
Star History
GitHub stargazers over time
Watcher History
GitHub watchers over time (collection started in 2023)
Recent Commit History
566 commits on the default branch (main) since January 2022
Yearly Commits
Commits to the default branch (main) per year
Issue History
Languages
The only known language in this repository is Python
updated: 2024-05-30 @ 05:57am, id: 228891845 / R_kgDODaScxQ