kyegomez / MoE-Mamba

Implementation of MoE-Mamba from the paper "MoE-Mamba: Efficient Selective State Space Models with Mixture of Experts" in PyTorch and Zeta
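The paper's core idea is to interleave Mamba (selective state space) blocks with sparse mixture-of-experts feed-forward layers. Below is a minimal PyTorch sketch of that pattern, not the repository's actual API: the names `SwitchMoE` and `MoEMambaBlock` are hypothetical, and a causal depthwise convolution stands in for the real Mamba sequence mixer.

```python
import torch
import torch.nn as nn


class SwitchMoE(nn.Module):
    """Switch-style mixture of experts: each token is routed to its single
    highest-scoring expert FFN and scaled by the router probability."""

    def __init__(self, dim: int, num_experts: int = 4):
        super().__init__()
        self.router = nn.Linear(dim, num_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, seq, dim)
        probs = self.router(x).softmax(dim=-1)            # (batch, seq, num_experts)
        gate, idx = probs.max(dim=-1)                     # top-1 expert per token
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            mask = idx == e                               # tokens assigned to expert e
            if mask.any():
                out[mask] = expert(x[mask])
        return out * gate.unsqueeze(-1)                   # weight by router confidence


class MoEMambaBlock(nn.Module):
    """One layer of the MoE-Mamba pattern: a sequence mixer followed by a
    sparse MoE feed-forward, each pre-normed with a residual connection.
    A causal depthwise convolution stands in for the real selective SSM."""

    def __init__(self, dim: int, num_experts: int = 4):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.mixer = nn.Conv1d(dim, dim, kernel_size=3, padding=2, groups=dim)
        self.norm2 = nn.LayerNorm(dim)
        self.moe = SwitchMoE(dim, num_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, seq, dim)
        h = self.norm1(x).transpose(1, 2)                 # (batch, dim, seq) for conv
        h = self.mixer(h)[..., : x.size(1)]               # trim right pad -> causal
        x = x + h.transpose(1, 2)
        return x + self.moe(self.norm2(x))


x = torch.randn(1, 10, 512)
print(MoEMambaBlock(dim=512, num_experts=4)(x).shape)  # torch.Size([1, 10, 512])
```

Because each token activates only one expert, this layout lets total parameters grow with the number of experts while per-token compute stays roughly constant, which is the efficiency argument the paper makes for pairing MoE with Mamba.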

Date created: 2024-01-21 (about a year ago)
Commits: 14 (last one about a year ago)
Stargazers: 92 (0 this week)
Watchers: 5 (0 this week)
Forks: 5
License: MIT
Ranking

RepositoryStats indexes 609,066 repositories; of these, kyegomez/MoE-Mamba is ranked #316,108 (48th percentile) for total stargazers and #338,640 for total watchers. GitHub reports the primary language for this repository as Python; among repositories using this language it is ranked #59,435 of 122,846.

kyegomez/MoE-Mamba is also tagged with popular topics; for these it is ranked: ai (#2,425/4,341) and ml (#394/635).

Other Information

kyegomez/MoE-Mamba has 1 open pull request on GitHub; 1 pull request has been merged over the lifetime of the repository.

Homepage URL: https://discord.gg/GYbXvDGevY

Star History

GitHub stargazers over time

Watcher History

GitHub watchers over time; collection started in 2023

Recent Commit History

14 commits on the default branch (main) since January 2022

Yearly Commits

Commits to the default branch (main) per year

Issue History

Languages

The primary language is Python, but there are others as well...

updated: 2025-01-28 @ 07:38pm, id: 746412048 / R_kgDOLH1YEA