Lance0218 / Pytorch-DistributedDataParallel-Training-Tricks

A guide that integrates PyTorch DistributedDataParallel, Apex mixed-precision training, learning-rate warmup, and a learning-rate scheduler, and also covers setting up early stopping and fixing the random seed.

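The page doesn't reproduce the repository's code, but a minimal, self-contained sketch of the combination the description names might look like the following. It uses torch.cuda.amp as a modern stand-in for Apex mixed precision, and every module and variable name here is illustrative rather than taken from the repo; launching is via torchrun, one process per GPU.

```python
# Sketch of DDP + mixed precision + warmup scheduler + early stopping + seeding.
# Illustrative only; torch.cuda.amp stands in for Apex amp.
import os
import random

import numpy as np
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, DistributedSampler, TensorDataset


def set_seed(seed: int) -> None:
    # Fix every RNG so runs are reproducible across processes.
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)


def main() -> None:
    # torchrun sets RANK / LOCAL_RANK / WORLD_SIZE for each spawned process.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)
    set_seed(42)

    model = nn.Linear(32, 2).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])

    # Toy dataset; DistributedSampler shards it across ranks.
    dataset = TensorDataset(torch.randn(1024, 32), torch.randint(0, 2, (1024,)))
    sampler = DistributedSampler(dataset)
    loader = DataLoader(dataset, batch_size=64, sampler=sampler)

    optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
    # Linear warmup for the first few epochs, then cosine decay.
    warmup_epochs, total_epochs = 3, 20
    warmup = torch.optim.lr_scheduler.LinearLR(
        optimizer, start_factor=0.1, total_iters=warmup_epochs)
    cosine = torch.optim.lr_scheduler.CosineAnnealingLR(
        optimizer, T_max=total_epochs - warmup_epochs)
    scheduler = torch.optim.lr_scheduler.SequentialLR(
        optimizer, schedulers=[warmup, cosine], milestones=[warmup_epochs])

    scaler = torch.cuda.amp.GradScaler()  # mixed precision (Apex stand-in)
    criterion = nn.CrossEntropyLoss()

    best_loss, patience, bad_epochs = float("inf"), 5, 0
    for epoch in range(total_epochs):
        sampler.set_epoch(epoch)  # reshuffle shards each epoch
        epoch_loss = 0.0
        for x, y in loader:
            x, y = x.cuda(local_rank), y.cuda(local_rank)
            optimizer.zero_grad()
            with torch.cuda.amp.autocast():
                loss = criterion(model(x), y)
            scaler.scale(loss).backward()
            scaler.step(optimizer)
            scaler.update()
            epoch_loss += loss.item()
        scheduler.step()

        # Sum the loss across ranks so every process sees the same value
        # and takes the same early-stopping branch (no deadlock on break).
        loss_sum = torch.tensor([epoch_loss], device=f"cuda:{local_rank}")
        dist.all_reduce(loss_sum)
        epoch_loss = loss_sum.item()

        # Early stopping: quit when loss hasn't improved for `patience` epochs.
        if epoch_loss < best_loss:
            best_loss, bad_epochs = epoch_loss, 0
        else:
            bad_epochs += 1
            if bad_epochs >= patience:
                break

    dist.destroy_process_group()


if __name__ == "__main__":
    main()  # launch with: torchrun --nproc_per_node=NUM_GPUS this_script.py
```

SequentialLR stitches the linear warmup onto the cosine decay, and the all-reduce of the epoch loss keeps every rank in agreement on when to stop, which matters because a single rank breaking out of the loop early would hang the collective operations on the others.
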
Date Created 2020-08-25 (4 years ago)
Commits 15 (last one 2 years ago)
Stargazers 60 (0 this week)
Watchers 2 (0 this week)
Forks 11
License MIT
Ranking

RepositoryStats indexes 582,612 repositories; of these, Lance0218/Pytorch-DistributedDataParallel-Training-Tricks is ranked #408,867 (30th percentile) for total stargazers and #477,753 for total watchers. GitHub reports the primary language for this repository as Python; among repositories using this language it is ranked #77,482/115,873.

Lance0218/Pytorch-DistributedDataParallel-Training-Tricks is also tagged with popular topics; for these it is ranked: pytorch (#4,532/5,933), distributed (#336/379)

Other Information

Lance0218/Pytorch-DistributedDataParallel-Training-Tricks has GitHub issues enabled; there is 1 open issue and 0 closed issues.

Star History

GitHub stargazers over time

Watcher History

GitHub watchers over time; collection started in 2023

Recent Commit History

3 commits on the default branch (master) since Jan '22

Yearly Commits

Commits to the default branch (master) per year

Issue History

Languages

The primary language is Python, but other languages are also present...

updated: 2024-11-14 @ 05:56am, id: 290124569 / R_kgDOEUrzGQ