yoshitomo-matsubara / torchdistill
A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. have been implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
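To illustrate the kind of objective the distillation methods in this framework build on, here is a minimal, self-contained sketch of the classic soft-target knowledge distillation loss (Hinton et al., 2015) in plain Python. This is not torchdistill's API; the function names, the temperature value, and the pure-Python implementation are illustrative assumptions (torchdistill itself works with PyTorch tensors and YAML configs).

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T produces a softer distribution.
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, temperature=4.0):
    # KL(teacher || student) over temperature-softened distributions,
    # scaled by T^2 as proposed by Hinton et al. (illustrative sketch).
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return temperature ** 2 * kl

# Identical logits yield zero loss; diverging logits yield a positive loss.
print(kd_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))        # → 0.0
print(kd_loss([0.5, 0.5, 0.5], [3.0, 0.0, -1.0]) > 0)   # → True
```

In practice this soft-target term is combined with the usual cross-entropy on ground-truth labels; torchdistill lets such loss compositions be declared in a configuration file rather than hand-coded.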
RepositoryStats indexes 595,856 repositories; of these, yoshitomo-matsubara/torchdistill is ranked #37,272 (94th percentile) for total stargazers and #111,217 for total watchers. GitHub reports the primary language for this repository as Python; among repositories using this language, it is ranked #5,744/119,431.
There have been 26 releases, the latest one was published on 2024-12-15 (6 days ago) with the name PyTorch 2.5 support, model migrations, end of Python 3.8 support.
Homepage URL: https://yoshitomo-matsubara.net/torchdistill/
Star History
GitHub stargazers over time
Watcher History
GitHub watchers over time; data collection started in 2023
Recent Commit History
595 commits on the default branch (main) since January 2022
Yearly Commits
Commits to the default branch (main) per year
Issue History
Languages
The only known language in this repository is Python
updated: 2024-12-20 @ 11:59pm, id: 228891845 / R_kgDODaScxQ