kyegomez / MoE-Mamba

Implementation of MoE-Mamba from the paper "MoE-Mamba: Efficient Selective State Space Models with Mixture of Experts", in PyTorch and Zeta
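The paper's core idea is interleaving Mamba (selective state space) blocks with sparse mixture-of-experts layers, where a gate routes each token to a small subset of experts. As an illustrative sketch only, not the repository's actual code, here is top-k expert routing in plain Python (all weights are random placeholders; a real layer would learn them):

```python
import math
import random

random.seed(0)

def softmax(xs):
    # numerically stable softmax over a list of floats
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

class TinyMoE:
    """Toy top-k mixture-of-experts layer (illustrative, not MoE-Mamba's code).

    Each "expert" is a random linear map. The gate scores every expert for
    the input, keeps the top_k, renormalizes their weights, and returns the
    weighted mix of the selected experts' outputs.
    """
    def __init__(self, dim, num_experts, top_k=2):
        self.top_k = top_k
        # hypothetical random parameters, stand-ins for learned weights
        self.gate = [[random.gauss(0, 1) for _ in range(dim)]
                     for _ in range(num_experts)]
        self.experts = [[[random.gauss(0, 1) for _ in range(dim)]
                         for _ in range(dim)]
                        for _ in range(num_experts)]

    @staticmethod
    def matvec(w, x):
        # multiply a matrix (list of rows) by a vector
        return [sum(wi * xi for wi, xi in zip(row, x)) for row in w]

    def forward(self, x):
        scores = softmax(self.matvec(self.gate, x))
        # indices of the top_k experts by gate score
        top = sorted(range(len(scores)), key=lambda i: -scores[i])[:self.top_k]
        norm = sum(scores[i] for i in top)
        out = [0.0] * len(x)
        for i in top:
            weight = scores[i] / norm  # renormalized routing weight
            y = self.matvec(self.experts[i], x)
            out = [o + weight * yi for o, yi in zip(out, y)]
        return out

moe = TinyMoE(dim=4, num_experts=8, top_k=2)
y = moe.forward([0.5, -1.0, 0.25, 2.0])
print(len(y))  # output keeps the input's dimensionality
```

Because only `top_k` of the `num_experts` expert maps run per token, compute stays roughly constant as the expert count (and parameter count) grows, which is the efficiency argument the paper combines with Mamba blocks.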

Date Created 2024-01-21 (8 months ago)
Commits 14 (last one 8 months ago)
Stargazers 78 (3 this week)
Watchers 5 (0 this week)
Forks 3
License MIT
Ranking

RepositoryStats indexes 565,279 repositories; of these, kyegomez/MoE-Mamba is ranked #335,056 (41st percentile) for total stargazers and #326,700 for total watchers. GitHub reports the primary language for this repository as Python; among repositories using this language, it is ranked #62,110/111,292.

kyegomez/MoE-Mamba is also tagged with popular topics, for which it is ranked: ai (#2,184/3,586), ml (#393/582)

Other Information

kyegomez/MoE-Mamba has 1 open pull request on GitHub; 1 pull request has been merged over the lifetime of the repository.

GitHub issues are enabled, with 1 open issue and 2 closed issues.

Homepage URL: https://discord.gg/GYbXvDGevY

Star History

GitHub stargazers over time

Watcher History

GitHub watchers over time; collection started in 2023

Recent Commit History

14 commits on the default branch (main) since Jan '22

Yearly Commits

Commits to the default branch (main) per year

Issue History

Languages

The primary language is Python, but other languages are present as well.

updated: 2024-09-27 @ 11:21am, id: 746412048 / R_kgDOLH1YEA