Tebmer / Awesome-Knowledge-Distillation-of-LLMs

This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vertical Distillation of LLMs.
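As context for the Distillation Algorithms branch: the standard white-box KD objective trains a student model to match the teacher's softened next-token distribution via a KL-divergence loss. Below is a minimal PyTorch sketch of that loss; the temperature, tensor shapes, and random logits are illustrative assumptions, not code from the survey or this repository.

```python
# Minimal sketch of the classic white-box KD loss: KL divergence between
# the teacher's and student's softened next-token distributions.
# Shapes and values here are toy assumptions for illustration only.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL(teacher || student) over the vocabulary.

    student_logits, teacher_logits: (batch, seq_len, vocab_size)
    """
    s_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    t_probs = F.softmax(teacher_logits / temperature, dim=-1)
    # reduction="batchmean" sums the KL over all elements and divides by
    # the leading batch dimension; T^2 rescales gradients so the soft-label
    # term stays comparable to a hard-label cross-entropy term.
    return F.kl_div(s_log_probs, t_probs, reduction="batchmean") * temperature**2

# Toy usage: random logits stand in for real teacher/student outputs
# (vocab size 128 here; a real LLM vocabulary would be far larger).
student_logits = torch.randn(2, 8, 128, requires_grad=True)
teacher_logits = torch.randn(2, 8, 128)
loss = distillation_loss(student_logits, teacher_logits)
loss.backward()
```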

Date Created: 2024-02-08 (9 months ago)
Commits: 52 (last one 16 days ago)
Stargazers: 630 (3 this week)
Watchers: 10 (0 this week)
Forks: 38
License: unknown
Ranking

RepositoryStats indexes 579,238 repositories; of these, Tebmer/Awesome-Knowledge-Distillation-of-LLMs is ranked #75,650 (87th percentile) for total stargazers and #205,652 for total watchers.

Tebmer/Awesome-Knowledge-Distillation-of-LLMs is also tagged with popular topics; for these it's ranked: llm (#553/2654), compression (#86/444), knowledge-distillation (#22/154)

Other Information

Tebmer/Awesome-Knowledge-Distillation-of-LLMs has 1 open pull request on GitHub; 2 pull requests have been merged over the lifetime of the repository.

Star History

GitHub stargazers over time

Watcher History

GitHub watchers over time; collection started in '23

Recent Commit History

52 commits on the default branch (main) since Jan '22

Yearly Commits

Commits to the default branch (main) per year

Issue History

Languages

We don't have any language data for this repository. It's a mystery.

updated: 2024-11-06 @ 09:34pm, id: 754706306 / R_kgDOLPvngg