thecml / dpsgd-optimizer

Amortized version of the differentially private SGD (DP-SGD) algorithm published in "Deep Learning with Differential Privacy" by Abadi et al. (2016). Enforces privacy by clipping per-example gradients and sanitizing them with Gaussian noise during training.
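The core mechanism described above (per-example clipping plus Gaussian noise) can be sketched as follows. This is a minimal NumPy illustration of the general DP-SGD update from Abadi et al., not this repository's TensorFlow implementation; the function and parameter names (`dpsgd_step`, `clip_norm`, `noise_multiplier`) are illustrative.

```python
import numpy as np

def dpsgd_step(params, per_example_grads, lr=0.1, clip_norm=1.0,
               noise_multiplier=1.1, rng=None):
    """One DP-SGD update (sketch):
    1. Clip each per-example gradient to L2 norm <= clip_norm.
    2. Average the clipped gradients.
    3. Add Gaussian noise with std noise_multiplier * clip_norm / batch_size.
    4. Take a gradient descent step.
    """
    rng = rng or np.random.default_rng(0)
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale down (never up) so the gradient's norm is at most clip_norm.
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    mean_grad = np.mean(clipped, axis=0)
    noise = rng.normal(
        0.0, noise_multiplier * clip_norm / len(per_example_grads),
        size=mean_grad.shape)
    return params - lr * (mean_grad + noise)
```

With `noise_multiplier=0` this reduces to ordinary clipped SGD, which makes the clipping step easy to check in isolation; the noise scale and clip norm together determine the privacy budget spent per step.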

Date Created 2020-10-01 (4 years ago)
Commits 80 (last one 8 months ago)
Stargazers 41 (0 this week)
Watchers 2 (0 this week)
Forks 5
License MIT
Ranking

RepositoryStats indexes 595,856 repositories; of these, thecml/dpsgd-optimizer is ranked #515,470 (13th percentile) for total stargazers and #485,301 for total watchers. GitHub reports the primary language for this repository as Python; among repositories using this language it is ranked #100,448/119,431.

thecml/dpsgd-optimizer is also tagged with popular topics, for which it is ranked: python (#20,129/22,324), deep-learning (#7,680/8,512), tensorflow (#2,122/2,255), privacy (#944/1,043), research (#466/508), tensorflow2 (#227/249)

Other Information

There has been 1 release; the latest was published on 2024-04-04 (8 months ago) with the name Initial public release.

Star History

GitHub stargazers over time

Watcher History

GitHub watchers over time; collection started in 2023

Recent Commit History

4 commits on the default branch (master) since January 2022

Yearly Commits

Commits to the default branch (master) per year

Issue History

Languages

The only known language in this repository is Python.

updated: 2024-12-06 @ 09:12am, id: 300303240 / R_kgDOEeZDiA