bwconrad / soft-moe

PyTorch implementation of "From Sparse to Soft Mixtures of Experts"

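For orientation only, a minimal sketch of the soft dispatch/combine routing described in that paper is shown below. The class name SoftMoE, parameter names, and tensor shapes are illustrative assumptions and do not necessarily match this repository's actual API.

```python
import torch
import torch.nn as nn


class SoftMoE(nn.Module):
    """Illustrative soft mixture-of-experts layer (not this repo's API)."""

    def __init__(self, dim, num_experts=4, slots_per_expert=1, hidden_dim=None):
        super().__init__()
        hidden_dim = hidden_dim or 4 * dim
        self.num_slots = num_experts * slots_per_expert
        # One learnable embedding per slot, used to score every input token.
        self.slot_embeds = nn.Parameter(torch.randn(self.num_slots, dim) * dim**-0.5)
        # Each expert is a small MLP applied to its own group of slots.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, hidden_dim), nn.GELU(), nn.Linear(hidden_dim, dim))
            for _ in range(num_experts)
        )

    def forward(self, x):  # x: (batch, tokens, dim)
        # Token-slot affinity logits.
        logits = torch.einsum("btd,sd->bts", x, self.slot_embeds)
        # Dispatch weights: softmax over tokens, so each slot is a convex mix of tokens.
        dispatch = logits.softmax(dim=1)
        # Combine weights: softmax over slots, so each token is a convex mix of slot outputs.
        combine = logits.softmax(dim=2)
        slots = torch.einsum("btd,bts->bsd", x, dispatch)
        # Run each expert on its group of slots, then merge back per token.
        slot_groups = slots.chunk(len(self.experts), dim=1)
        expert_out = torch.cat([e(s) for e, s in zip(self.experts, slot_groups)], dim=1)
        return torch.einsum("bsd,bts->btd", expert_out, combine)


# Quick shape check.
layer = SoftMoE(dim=64, num_experts=4, slots_per_expert=2)
out = layer(torch.randn(2, 16, 64))
print(out.shape)  # torch.Size([2, 16, 64])
```
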
Date Created 2023-08-16 (about a year ago)
Commits 16 (last one about a year ago)
Stargazers 40 (1 this week)
Watchers 3 (0 this week)
Forks 2
License apache-2.0
Ranking

RepositoryStats indexes 564,918 repositories; of these, bwconrad/soft-moe is ranked #498,559 (12th percentile) for total stargazers and #413,662 for total watchers. Github reports the primary language for this repository as Python; among repositories using this language it is ranked #95,692/111,200.

bwconrad/soft-moe is also tagged with several popular topics, for which it is ranked: machine-learning (#7,056/7,699), pytorch (#5,293/5,820), computer-vision (#2,675/2,971), transformer (#874/975)

Other Information

bwconrad/soft-moe has Github issues enabled; there are 2 open issues and 1 closed issue.

There has been 1 release; it was published on 2023-08-22 (about a year ago) under the name 0.0.1.

Star History

Github stargazers over time

Watcher History

Github watchers over time (collection started in 2023)

Recent Commit History

16 commits on the default branch (main) since Jan '22

Yearly Commits

Commits to the default branch (main) per year

Issue History

Languages

The only known language in this repository is Python

updated: 2024-09-25 @ 07:16am, id: 679378691 / R_kgDOKH5_Aw