yoshitomo-matsubara / torchdistill

A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
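torchdistill drives knowledge distillation training from declarative YAML configs rather than hand-written loops, so the sketch below illustrates the underlying technique, not the library's own API: the vanilla knowledge distillation loss (Hinton et al., 2015) in plain PyTorch. The model sizes, the `kd_loss` helper, and the hyperparameters (`temperature`, `alpha`) are illustrative assumptions, not values taken from the repository.

```python
# Minimal sketch of vanilla knowledge distillation in plain PyTorch.
# Illustrative only -- NOT torchdistill's API; the framework configures
# equivalent training pipelines from YAML instead of code like this.
import torch
import torch.nn as nn
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, temperature=4.0, alpha=0.5):
    """Weighted sum of the soft (teacher) and hard (label) losses."""
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)  # rescale gradients per Hinton et al.
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1.0 - alpha) * hard

# Hypothetical teacher/student and random data, just to make it runnable.
teacher = nn.Linear(32, 10).eval()
student = nn.Linear(32, 10)
optimizer = torch.optim.SGD(student.parameters(), lr=0.1)

x = torch.randn(8, 32)
y = torch.randint(0, 10, (8,))
with torch.no_grad():          # teacher is frozen during distillation
    t_logits = teacher(x)

loss = kd_loss(student(x), t_logits, y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```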

Date Created 2019-12-18 (4 years ago)
Commits 1,538 (last one about a month ago)
Stargazers 1,393 (1 this week)
Watchers 19 (0 this week)
Forks 131
License MIT
Ranking

RepositoryStats indexes 584,353 repositories; of these, yoshitomo-matsubara/torchdistill is ranked #37,342 (94th percentile) by total stargazers and #116,307 by total watchers. GitHub reports the primary language for this repository as Python; among repositories using this language, it is ranked #5,730/116,326.

yoshitomo-matsubara/torchdistill is also tagged with popular topics, for which it is ranked: pytorch (#638/5,948), nlp (#275/2,400), natural-language-processing (#183/1,411), object-detection (#133/1,158), transformer (#89/1,000), semantic-segmentation (#39/512), image-classification (#48/393), knowledge-distillation (#14/157)

Other Information

There have been 25 releases; the latest, "A new KD method, new benchmark results, and updated YAML constructors", was published on 2024-08-13 (3 months ago).

Homepage URL: https://yoshitomo-matsubara.net/torchdistill/

Star History

GitHub stargazers over time

Watcher History

GitHub watchers over time, collection started in 2023

Recent Commit History

578 commits on the default branch (main) since Jan 2022

Yearly Commits

Commits to the default branch (main) per year

Issue History

Languages

The only known language in this repository is Python


updated: 2024-11-20 @ 12:17am, id: 228891845 / R_kgDODaScxQ