YanaiEliyahu / AdasOptimizer

Adas is short for Adaptive Step Size. Unlike optimizers that only normalize the derivative, it fine-tunes the step size itself, aiming to make step-size scheduling obsolete and to achieve state-of-the-art training performance.
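
To illustrate the general idea only (this is not the repository's actual Adas algorithm), the sketch below shows a per-parameter step size that is itself adjusted during training: it grows where successive gradients agree in sign and shrinks where they disagree. The function name `adaptive_step_sgd` and its hyperparameters are hypothetical, chosen just for this example.

```python
import numpy as np

def adaptive_step_sgd(grad_fn, w, lr0=0.01, gain=0.1, steps=100):
    """Toy gradient descent with a self-adjusting per-parameter step size.

    Illustrative sketch only; not the Adas update rule from this repository.
    """
    lr = np.full_like(w, lr0)      # one step size per parameter
    prev_g = np.zeros_like(w)
    for _ in range(steps):
        g = grad_fn(w)
        # Grow the step size where the gradient direction is consistent
        # with the previous step, shrink it where it flips.
        lr *= 1.0 + gain * np.sign(g * prev_g)
        w = w - lr * g
        prev_g = g
    return w

# Example: minimize f(w) = ||w||^2, whose gradient is 2w.
print(adaptive_step_sgd(lambda w: 2.0 * w, np.array([3.0, -4.0])))  # near zero
```

The point of the sketch is only the mechanism named in the description: the quantity being tuned is the step size itself, rather than a normalization of the gradient.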

Date Created 2020-04-10 (4 years ago)
Commits 99 (last one 3 years ago)
Stargazers 85 (0 this week)
Watchers 7 (0 this week)
Forks 11
License MIT
Ranking

RepositoryStats indexes 584,777 repositories; of these, YanaiEliyahu/AdasOptimizer is ranked #324,105 (45th percentile) for total stargazers and #269,350 for total watchers. GitHub reports the primary language for this repository as C++; among repositories using this language it is ranked #18,157/31,292.

YanaiEliyahu/AdasOptimizer is also tagged with popular topics, for which it is ranked: deep-learning (#5,446/8,398), machine-learning (#5,116/7,935)

Star History

GitHub stargazers over time

Watcher History

GitHub watchers over time (collection started in 2023)

Recent Commit History

0 commits on the default branch (master) since Jan 2022

Inactive

No recent commits to this repository

Yearly Commits

Commits to the default branch (master) per year

Issue History

Languages

The primary language is C++, but other languages are also present.

updated: 2024-09-19 @ 09:04pm, id: 254716923 / R_kgDODy6r-w