CASE-Lab-UMD / Unified-MoE-Compression

The official implementation of the paper "Demystifying the Compression of Mixture-of-Experts Through a Unified Framework".

Date Created 2024-02-24 (10 months ago)
Commits 28 (last one about a month ago)
Stargazers 49 (0 this week)
Watchers 2 (0 this week)
Forks 5
License apache-2.0
Ranking

RepositoryStats indexes 595,856 repositories; among these, CASE-Lab-UMD/Unified-MoE-Compression is ranked #475,677 (20th percentile) by total stargazers and #485,301 by total watchers. GitHub reports the primary language for this repository as Python; among repositories using this language it is ranked #91,382/119,431.

CASE-Lab-UMD/Unified-MoE-Compression is also tagged with popular topics; within each topic it is ranked: deep-learning (#7,138/8,512), machine-learning (#6,808/8,063), natural-language-processing (#1,234/1,429), large-language-models (#845/1,090), model-compression (#91/108)

Star History

GitHub stargazers over time

Watcher History

GitHub watchers over time; collection started in 2023

Recent Commit History

28 commits on the default branch (master) since Jan 2022

Yearly Commits

Commits to the default branch (master) per year

Issue History

Languages

The primary language is Python, but others are present as well.

updated: 2024-12-11 @ 05:58pm, id: 762783783 / R_kgDOLXcoJw