Tebmer / Awesome-Knowledge-Distillation-of-LLMs

This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". It breaks KD down into knowledge elicitation and distillation algorithms, and explores the skill and vertical distillation of LLMs.

Date Created 2024-02-08 (3 months ago)
Commits 47 (last one about a month ago)
Stargazers 305 (8 this week)
Watchers 4 (0 this week)
Forks 20
License unknown
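Counts like those above are exposed by GitHub's public REST API (`GET /repos/{owner}/{repo}`). A minimal sketch of pulling them out of such a response, using an illustrative payload rather than a live request (the values simply mirror the stats listed above):

```python
import json

# Illustrative payload with the fields GitHub's GET /repos/{owner}/{repo}
# endpoint returns; values here mirror the stats above, not a live call.
payload = json.dumps({
    "full_name": "Tebmer/Awesome-Knowledge-Distillation-of-LLMs",
    "stargazers_count": 305,
    "subscribers_count": 4,   # "watchers" in the web-UI sense
    "forks_count": 20,
    "license": None,          # no license detected for this repo
})

repo = json.loads(payload)
stats = {
    "stars": repo["stargazers_count"],
    "watchers": repo["subscribers_count"],
    "forks": repo["forks_count"],
    "license": repo["license"]["spdx_id"] if repo["license"] else "unknown",
}
print(stats)
```

Note that `stargazers_count` and `subscribers_count` are the fields that correspond to the "Stargazers" and "Watchers" numbers shown here; the API's legacy `watchers_count` field actually duplicates the star count.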
Ranking

RepositoryStats indexes 523,840 repositories; of these, Tebmer/Awesome-Knowledge-Distillation-of-LLMs is ranked #122,434 (77th percentile) for total stargazers and #351,116 for total watchers.

Tebmer/Awesome-Knowledge-Distillation-of-LLMs is also tagged with popular topics, for which it is ranked: llm (#609/1,875), compression (#125/417)
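The percentile quoted above follows directly from the rank and the index size; as a quick check:

```python
# Percentile from rank: rank #122,434 out of 523,840 indexed repositories.
indexed = 523_840
star_rank = 122_434

# Fraction of repositories ranked at or below this one, as a percentage.
percentile = (1 - star_rank / indexed) * 100
print(round(percentile))  # → 77, matching the "77th percentile" above
```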

Star History

GitHub stargazers over time

Watcher History

GitHub watchers over time; collection started in '23

Recent Commit History

47 commits on the default branch (main) since Jan '22

Yearly Commits

Commits to the default branch (main) per year

Issue History

Languages

No language data is available for this repository.

Updated: 2024-05-31 @ 08:40pm · id: 754706306 / R_kgDOLPvngg