YanaiEliyahu / AdasOptimizer

ADAS is short for Adaptive Step Size. Unlike optimizers that merely normalize the derivative, it fine-tunes the step size itself, making step-size scheduling obsolete and achieving state-of-the-art training performance.

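To illustrate the general idea only (a hedged sketch, not the repository's actual algorithm), the short Python example below adapts the step size with a hypergradient-style rule: the learning rate is nudged up when successive gradients point in a similar direction and down when they disagree. The names here (adaptive_step_sgd, grad_fn, meta_lr) are hypothetical and do not come from the repository.

    import numpy as np

    def adaptive_step_sgd(grad_fn, theta, lr=0.001, meta_lr=1e-4, steps=100):
        # Illustrative gradient descent where the step size itself is tuned
        # online: the learning rate grows when successive gradients agree and
        # shrinks when they conflict (a sign of overshooting). Not the Adas
        # update rule, just the general adaptive-step-size idea.
        prev_grad = np.zeros_like(theta)
        for _ in range(steps):
            g = grad_fn(theta)
            # Hypergradient-style learning-rate update, kept positive.
            lr = max(lr + meta_lr * float(np.dot(g, prev_grad)), 1e-8)
            theta = theta - lr * g
            prev_grad = g
        return theta

    # Toy usage: minimize 0.5 * ||x||^2, whose gradient is x itself.
    minimum = adaptive_step_sgd(lambda x: x, theta=np.array([5.0, -3.0]))

In this toy setup the step size grows by construction while successive gradients keep agreeing and shrinks after an overshoot, without any hand-tuned schedule.
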
Date Created 2020-04-10 (4 years ago)
Commits 99 (last one 3 years ago)
Stargazers 85 (0 this week)
Watchers 7 (0 this week)
Forks 11
License MIT
Ranking

RepositoryStats indexes 565,279 repositories; of these, YanaiEliyahu/AdasOptimizer is ranked #316,000 (44th percentile) for total stargazers and #265,423 for total watchers. GitHub reports the primary language for this repository as C++; among repositories using this language it is ranked #17,685/30,279.

YanaiEliyahu/AdasOptimizer is also tagged with popular topics, for which it is ranked: deep-learning (#5,333/8,171), machine-learning (#5,002/7,698)

Star History

GitHub stargazers over time

Watcher History

GitHub watchers over time; collection started in 2023

Recent Commit History

0 commits on the default branch (master) since January 2022

Inactive

No recent commits to this repository

Yearly Commits

Commits to the default branch (master) per year

Issue History

Languages

The primary language is C++, but other languages are also present.

updated: 2024-09-19 @ 09:04pm, id: 254716923 / R_kgDODy6r-w