4 results found
This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vert...
Created 2024-02-08 · 52 commits to main branch, last one 2 months ago
A PyTorch implementation of the paper "Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation", https://arxiv.org/abs/1905.08094
Created 2020-02-26 · 9 commits to master branch, last one 2 years ago
[ACL 2024] The official codebase for the paper "Self-Distillation Bridges Distribution Gap in Language Model Fine-tuning".
Created 2024-02-20 · 25 commits to main branch, last one 3 months ago
Deep Hash Distillation for Image Retrieval - ECCV 2022
Created 2021-11-22 · 72 commits to main branch, last one 5 months ago