megvii-research / mdistiller

The official implementation of [CVPR 2022] Decoupled Knowledge Distillation (https://arxiv.org/abs/2203.08679) and [ICCV 2023] DOT: A Distillation-Oriented Trainer (https://openaccess.thecvf.com/content/ICCV2023/papers/Zhao_DOT_A_Distillation-Oriented_Trainer_ICCV_2023_paper.pdf).
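
For context on what the repository implements: Decoupled Knowledge Distillation rewrites the classical KD loss as a weighted sum of a target-class term (TCKD) and a non-target-class term (NCKD). The snippet below is a minimal PyTorch sketch of that decomposition as described in the paper, not the repository's own code; the function name dkd_loss_sketch, the large-negative masking trick, and the default values for alpha, beta, and temperature are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def dkd_loss_sketch(logits_student, logits_teacher, target,
                    alpha=1.0, beta=8.0, temperature=4.0):
    """Minimal sketch of the DKD decomposition (alpha*TCKD + beta*NCKD); not mdistiller's code."""
    num_classes = logits_student.size(1)
    gt_mask = F.one_hot(target, num_classes).float()   # 1 at the ground-truth class
    other_mask = 1.0 - gt_mask

    p_s = F.softmax(logits_student / temperature, dim=1)
    p_t = F.softmax(logits_teacher / temperature, dim=1)

    # TCKD: KL divergence between binary (target vs. all other classes) distributions.
    p_s_bin = torch.stack([(p_s * gt_mask).sum(1), (p_s * other_mask).sum(1)], dim=1)
    p_t_bin = torch.stack([(p_t * gt_mask).sum(1), (p_t * other_mask).sum(1)], dim=1)
    tckd = F.kl_div(torch.log(p_s_bin + 1e-8), p_t_bin, reduction="batchmean")

    # NCKD: KL divergence over the non-target classes only; the target logit is pushed
    # to a large negative value before the softmax so it receives ~zero probability,
    # which is equivalent to renormalizing over the non-target classes.
    log_p_s_nt = F.log_softmax(logits_student / temperature - 1000.0 * gt_mask, dim=1)
    p_t_nt = F.softmax(logits_teacher / temperature - 1000.0 * gt_mask, dim=1)
    nckd = F.kl_div(log_p_s_nt, p_t_nt, reduction="batchmean")

    # Standard T^2 scaling used for temperature-softened KD losses.
    return (alpha * tckd + beta * nckd) * temperature ** 2
```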

Date created: 2022-03-16 (2 years ago)
Commits: 9 (last one about a year ago)
Stargazers: 816 (3 this week)
Watchers: 8 (0 this week)
Forks: 124
License: unknown
Ranking

RepositoryStats indexes 595,856 repositories; of these, megvii-research/mdistiller is ranked #62,219 (90th percentile) for total stargazers and #246,776 for total watchers. GitHub reports the primary language for this repository as Python; among repositories using this language it is ranked #9,810/119,431.

megvii-research/mdistiller is also tagged with popular topics; for these it is ranked: deep-learning (#1,500/8,512), pytorch (#968/6,025), computer-vision (#450/3,130), knowledge-distillation (#20/159).

Other Information

megvii-research/mdistiller has GitHub issues enabled; there are 20 open issues and 45 closed issues.

There have been 2 releases; the latest, named dot_checkpoints, was published on 2023-11-05 (about a year ago).

Star History

[Chart: GitHub stargazers over time]

Watcher History

[Chart: GitHub watchers over time; collection started in 2023]

Recent Commit History

[Chart: 9 commits on the default branch (master) since January 2022]

Yearly Commits

[Chart: commits to the default branch (master) per year]

Issue History

[Chart: issue history]

Languages

The only known language in this repository is Python.

updated: 2024-12-19 @ 03:25am, id: 470505142 / R_kgDOHAtWtg