yoshitomo-matsubara / torchdistill
A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. have been implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and support benchmarking.
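To illustrate the kind of method torchdistill implements, below is a minimal sketch of the classic knowledge distillation loss (Hinton et al., 2015) in plain PyTorch. The function name and parameters are illustrative assumptions for this sketch, not torchdistill's API; in torchdistill itself, such losses are wired up through YAML configuration rather than hand-written training code.

import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, temperature=4.0, alpha=0.5):
    # Illustrative sketch, not torchdistill's API: blend a softened
    # teacher-matching term with the usual cross-entropy on hard labels.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)  # T^2 scaling as in the original paper
    hard_loss = F.cross_entropy(student_logits, targets)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Example: a batch of 8 samples over 10 classes
student_logits = torch.randn(8, 10)
teacher_logits = torch.randn(8, 10)
targets = torch.randint(0, 10, (8,))
loss = kd_loss(student_logits, teacher_logits, targets)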
RepositoryStats indexes 584,353 repositories; of these, yoshitomo-matsubara/torchdistill is ranked #37,342 (94th percentile) for total stargazers and #116,307 for total watchers. GitHub reports the primary language for this repository as Python; among repositories using this language, it is ranked #5,730/116,326.
There have been 25 releases; the latest was published on 2024-08-13 (3 months ago) under the name "A new KD method, new benchmark results, and updated YAML constructors".
Homepage URL: https://yoshitomo-matsubara.net/torchdistill/
Star History
GitHub stargazers over time
Watcher History
GitHub watchers over time; collection started in 2023
Recent Commit History
578 commits on the default branch (main) since January 2022
Yearly Commits
Commits to the default branch (main) per year
Issue History
Languages
The only known language in this repository is Python
Updated: 2024-11-20 at 12:17 AM · id: 228891845 / R_kgDODaScxQ