yoshitomo-matsubara / torchdistill

A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and enable benchmarking.
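
For readers new to the topic: most of the implemented methods build on the classic knowledge distillation recipe (Hinton et al., 2015), in which a student network is trained against both the ground-truth labels and a teacher's temperature-softened outputs. A minimal generic PyTorch sketch of that baseline loss follows; the function name kd_loss and the default hyperparameters are illustrative assumptions, not torchdistill's own API.

    import torch
    import torch.nn.functional as F

    def kd_loss(student_logits, teacher_logits, targets, temperature=4.0, alpha=0.5):
        """Vanilla knowledge distillation loss: a weighted sum of cross-entropy
        on hard labels and KL divergence on temperature-softened logits.
        (Generic illustration; not torchdistill's actual interface.)"""
        # Standard cross-entropy against the ground-truth class labels
        hard_loss = F.cross_entropy(student_logits, targets)
        # KL divergence between softened student and teacher distributions;
        # the T^2 factor keeps gradient magnitudes comparable across temperatures
        soft_loss = F.kl_div(
            F.log_softmax(student_logits / temperature, dim=1),
            F.softmax(teacher_logits / temperature, dim=1),
            reduction="batchmean",
        ) * (temperature ** 2)
        return alpha * hard_loss + (1.0 - alpha) * soft_loss

In torchdistill itself, losses and training recipes like this are declared in YAML configuration files rather than written by hand, which is the sense in which the framework is described as coding-free.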

Date Created 2019-12-18 (4 years ago)
Commits 1,526 (last one 3 days ago)
Stargazers 1,293 (2 this week)
Watchers 18 (0 this week)
Forks 124
License MIT
Ranking

RepositoryStats indexes 523,840 repositories; of these, yoshitomo-matsubara/torchdistill is ranked #37,754 (93rd percentile) for total stargazers and #119,320 for total watchers. GitHub reports the primary language for this repository as Python; among repositories using this language, it is ranked #5,660/100,813.

yoshitomo-matsubara/torchdistill is also tagged with popular topics; for these it is ranked: pytorch (#632/5,505), nlp (#274/2,227), natural-language-processing (#184/1,311), object-detection (#136/1,065), transformer (#85/896), semantic-segmentation (#40/471), image-classification (#50/360)

Other Information

There have been 24 releases; the latest, "New KD methods, updated YAML constructors, and low-level loss support", was published on 2024-03-27 (2 months ago).

Homepage URL: https://yoshitomo-matsubara.net/torchdistill/

Star History

GitHub stargazers over time

Watcher History

GitHub watchers over time; collection started in 2023

Recent Commit History

566 commits on the default branch (main) since January 2022

Yearly Commits

Commits to the default branch (main) per year

Issue History

Languages

The only known language in this repository is Python.


updated: 2024-05-30 @ 05:57am, id: 228891845 / R_kgDODaScxQ