13 results found

Awesome Knowledge-Distillation. A categorized collection of knowledge distillation papers (2014-2021).
Created 2019-10-17
349 commits to main branch, last one about a year ago
PyTorch implementation of various Knowledge Distillation (KD) methods.
Created 2019-02-15
65 commits to master branch, last one 3 years ago
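None of the entries in this listing spell out the underlying technique, so for reference here is a minimal sketch of the classic logit-based KD loss (softened softmax plus KL divergence, as in Hinton et al.), written in PyTorch. The function name, temperature, and weighting values below are illustrative assumptions, not code taken from any of the listed repositories.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Classic logit-based KD: blend a soft-target KL term with hard-label CE.

    T (temperature) and alpha (soft/hard weighting) are illustrative defaults.
    """
    # Soften both distributions with temperature T.
    soft_student = F.log_softmax(student_logits / T, dim=1)
    soft_teacher = F.softmax(teacher_logits / T, dim=1)
    # KL divergence between softened distributions, scaled by T^2
    # (the standard gradient-scale correction).
    soft_loss = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * (T * T)
    # Ordinary cross-entropy on the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```

Each repository below implements its own variant (logit-, feature-, or relation-level distillation), so treat this only as a baseline point of comparison.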
This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vert...
Created 2024-02-08
52 commits to main branch, last one about a month ago
SimpleAICV: PyTorch training and testing examples.
Created 2020-05-31
89 commits to master branch, last one 8 days ago
A beginner's tutorial on model compression.
Created 2023-12-28
136 commits to main branch, last one 14 days ago
20 · 138 · apache-2.0 · 2
Official implementation of paper "Knowledge Distillation from A Stronger Teacher", NeurIPS 2022
Created 2022-05-21
19 commits to main branch, last one about a year ago
[ECCV2022] Factorizing Knowledge in Neural Networks
Created 2022-06-27
91 commits to main branch, last one 2 years ago
Training ImageNet / CIFAR models with state-of-the-art (SOTA) strategies and techniques such as ViT, KD, Rep, etc.
Created 2022-03-11
39 commits to main branch, last one 8 months ago
Awesome-3D/Multimodal-Anomaly-Detection-and-Localization/Segmentation/3D-KD/3D-knowledge-distillation
Created 2021-12-23
17 commits to main branch, last one about a year ago
5 · 65 · apache-2.0 · 1
Official implementation of paper "Masked Distillation with Receptive Tokens", ICLR 2023.
Created 2022-05-27
20 commits to main branch, last one about a year ago
13 · 64 · mit · 5
Matching Guided Distillation (ECCV 2020)
Created 2020-08-23
23 commits to master branch, last one 3 years ago
2 · 43 · apache-2.0 · 2
Rotated Localization Distillation (CVPR 2022, TPAMI 2023)
Created 2022-04-13
68 commits to main branch, last one 11 months ago