OpenSparseLLMs / LLaMA-MoE-v2

🚀LLaMA-MoE v2: Exploring Sparsity of LLaMA from Perspective of Mixture-of-Experts with Post-Training

Date Created: 2024-11-26 (22 days ago)
Commits: 272 (last one 15 days ago)
Stargazers: 62 (5 this week)
Watchers: 2 (0 this week)
Forks: 8
License: Apache-2.0
Ranking

RepositoryStats indexes 594,458 repositories; of these, OpenSparseLLMs/LLaMA-MoE-v2 is ranked #406,022 (32nd percentile) for total stargazers and #484,468 for total watchers. GitHub reports the primary language for this repository as Python; among repositories using this language it is ranked #77,239/118,961.

OpenSparseLLMs/LLaMA-MoE-v2 is also tagged with popular topics, for which it is ranked: llama (#404/542), fine-tuning (#131/187), llama3 (#113/167)

Other Information

OpenSparseLLMs/LLaMA-MoE-v2 has 1 open pull request on GitHub, and 0 pull requests have been merged over the lifetime of the repository.

GitHub issues are enabled; there is 1 open issue and 1 closed issue.

Homepage URL: https://arxiv.org/abs/2411.15708

Star History

GitHub stargazers over time

Watcher History

GitHub watchers over time (collection started in '23)

Recent Commit History

272 commits on the default branch (main) since Jan '22

Yearly Commits

Commits to the default branch (main) per year

Issue History

Languages

The primary language is Python, but other languages are present as well.

updated: 2024-12-16 @ 04:47pm, id: 894237728 / R_kgDONUz8IA