4 results found

This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vert...
Created 2024-02-08
47 commits to main branch, last one 2 months ago
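As a hedged illustration of the "knowledge elicitation" step this survey describes, the sketch below has a teacher LLM label a prompt set to build a distillation dataset for a student. The model name and prompt are placeholders, not anything taken from the survey or its repository.

```python
# Minimal sketch of elicitation-by-labeling: a teacher LLM generates
# responses that later serve as the student's fine-tuning targets.
from transformers import AutoModelForCausalLM, AutoTokenizer

teacher_name = "meta-llama/Llama-2-7b-chat-hf"  # placeholder teacher model
tok = AutoTokenizer.from_pretrained(teacher_name)
teacher = AutoModelForCausalLM.from_pretrained(teacher_name)

prompts = ["Explain knowledge distillation in one sentence."]  # toy prompt set
distill_set = []
for p in prompts:
    ids = tok(p, return_tensors="pt").input_ids
    out = teacher.generate(ids, max_new_tokens=64, do_sample=False)
    reply = tok.decode(out[0][ids.shape[1]:], skip_special_tokens=True)
    distill_set.append((p, reply))
# distill_set now holds (prompt, teacher_response) pairs for supervised
# fine-tuning of a smaller student model.
```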
A PyTorch implementation of the paper 'Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation', https://arxiv.org/abs/1905.08094
Created 2020-02-26
9 commits to master branch, last one 2 years ago
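A minimal sketch of the self-distillation objective the paper's title refers to: a shallow auxiliary classifier learns from both the ground-truth labels and the softened predictions of the network's own deepest classifier. The temperature and mixing weight below are placeholder defaults, not values from this repository.

```python
import torch
import torch.nn.functional as F

def self_distillation_loss(shallow_logits, deep_logits, labels, T=3.0, alpha=0.5):
    """Shallow exit trained on labels plus the deepest exit's soft targets."""
    ce = F.cross_entropy(shallow_logits, labels)
    kd = F.kl_div(
        F.log_softmax(shallow_logits / T, dim=1),
        F.softmax(deep_logits.detach() / T, dim=1),  # deepest exit as teacher
        reduction="batchmean",
    ) * T * T
    return (1 - alpha) * ce + alpha * kd
```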
[ACL 2024] The official codebase for the paper "Self-Distillation Bridges Distribution Gap in Language Model Fine-tuning".
Created 2024-02-20
17 commits to main branch, last one about a month ago
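A hedged sketch of the idea in the paper's title: before fine-tuning, the seed model rewrites each target response in its own words, keeping the training data close to the model's output distribution. The model name and rewrite prompt are illustrative placeholders, not the repository's actual code.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

name = "mistralai/Mistral-7B-Instruct-v0.2"  # placeholder seed model
tok = AutoTokenizer.from_pretrained(name)
model = AutoModelForCausalLM.from_pretrained(name)

def self_distill_example(instruction, reference):
    # Ask the seed model to restate the reference answer in its own words.
    prompt = (f"Instruction: {instruction}\n"
              f"Reference answer: {reference}\n"
              "Rewrite the reference answer in your own words:")
    ids = tok(prompt, return_tensors="pt").input_ids
    out = model.generate(ids, max_new_tokens=128, do_sample=False)
    return tok.decode(out[0][ids.shape[1]:], skip_special_tokens=True)

# Fine-tune on (instruction, rewritten_answer) pairs instead of the originals.
```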
Deep Hash Distillation for Image Retrieval - ECCV 2022
Created 2021-11-22
70 commits to main branch, last one about a month ago
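As an illustrative reconstruction (not this repository's implementation), a self-distilled hashing objective can match the continuous hash codes of two augmented views of the same image, with the weakly augmented view acting as the teacher:

```python
import torch
import torch.nn.functional as F

def self_distilled_hashing_loss(weak_codes, strong_codes):
    """Codes from a weakly augmented view (detached, teacher role) pull the
    strongly augmented view's codes toward them via cosine similarity."""
    weak = F.normalize(torch.tanh(weak_codes).detach(), dim=1)
    strong = F.normalize(torch.tanh(strong_codes), dim=1)
    return (1 - (weak * strong).sum(dim=1)).mean()
```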