uvipen / Contra-PPO-pytorch

Proximal Policy Optimization (PPO) algorithm for Contra
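The repository applies Proximal Policy Optimization, in PyTorch, to the game Contra. As a rough illustration of the core idea only (not taken from this repository's code), the sketch below shows the clipped surrogate objective that PPO optimizes; the function name, tensor shapes, and the clip epsilon of 0.2 are assumptions.

    # Minimal sketch of the PPO clipped surrogate loss (illustrative, not the repo's code).
    import torch

    def ppo_clip_loss(new_log_probs, old_log_probs, advantages, clip_eps=0.2):
        """Clipped policy loss from the PPO paper (Schulman et al., 2017)."""
        # Probability ratio r_t(theta) = pi_theta(a|s) / pi_theta_old(a|s)
        ratio = torch.exp(new_log_probs - old_log_probs)
        # Unclipped and clipped surrogate terms
        surr1 = ratio * advantages
        surr2 = torch.clamp(ratio, 1.0 - clip_eps, 1.0 + clip_eps) * advantages
        # Pessimistic minimum of the two, negated for gradient descent
        return -torch.min(surr1, surr2).mean()

    # Toy usage with random data
    if __name__ == "__main__":
        new_lp = torch.randn(8, requires_grad=True)
        old_lp = new_lp.detach() + 0.1 * torch.randn(8)
        adv = torch.randn(8)
        loss = ppo_clip_loss(new_lp, old_lp, adv)
        loss.backward()
        print(loss.item())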

Date Created 2019-09-06 (4 years ago)
Commits 3 (last one 3 years ago)
Stargazers 130 (0 this week)
Watchers 10 (0 this week)
Forks 30
License unknown
Ranking

RepositoryStats indexes 534,880 repositories; of these, uvipen/Contra-PPO-pytorch is ranked #227,832 (57th percentile) for total stargazers and #199,782 for total watchers. Github reports the primary language for this repository as Python; among repositories using this language it is ranked #40,569/103,597.

uvipen/Contra-PPO-pytorch is also tagged with popular topics; for these it is ranked: deep-learning (#4,218/7,826), ai (#1,518/3,141), openai (#857/1,883), reinforcement-learning (#603/1,190)

Other Information

uvipen/Contra-PPO-pytorch has 1 open pull request on Github; 0 pull requests have been merged over the lifetime of the repository.

Github issues are enabled; there is 1 open issue and 0 closed issues.

Star History

Github stargazers over time

Watcher History

Github watchers over time; collection started in 2023

Recent Commit History

0 commits on the default branch (master) since January 2022

Inactive

No recent commits to this repository

Yearly Commits

Commits to the default branch (master) per year

Issue History

Languages

The primary language is Python, but other languages are also present.

updated: 2024-06-29 @ 06:40am, id: 206873096 / R_kgDODFSiCA