10 results found

Collection of scripts to aggregate image data for training an NSFW Image Classifier.
Created 2019-01-11
52 commits to main branch, last one 2 years ago
License: gpl-3.0
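
Aggregation scripts of this kind typically boil down to collecting candidate images into per-label folders and deduplicating them. A minimal Python sketch, assuming content-hash deduplication; the function name and folder layout are illustrative, not this repository's:

```python
# Hypothetical aggregation step: copy unique images into dest_dir/label,
# skipping exact duplicates detected by SHA-256 content hash.
import hashlib
import shutil
from pathlib import Path

def aggregate(src_files, dest_dir, label):
    """Copy unique images into dest_dir/label; return how many were kept."""
    out = Path(dest_dir) / label
    out.mkdir(parents=True, exist_ok=True)
    seen = set()
    kept = 0
    for f in map(Path, src_files):
        digest = hashlib.sha256(f.read_bytes()).hexdigest()
        if digest in seen:
            continue  # exact byte-for-byte duplicate, skip
        seen.add(digest)
        shutil.copy(f, out / f"{digest[:16]}{f.suffix}")
        kept += 1
    return kept
```

Hashing the file contents rather than comparing filenames catches the same image downloaded twice under different names, which is common when scraping from multiple sources.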

A free, open-source, privacy-focused browser extension that blocks “not safe for work” content, built with TypeScript and TensorFlow.js.
Created 2020-07-08
357 commits to master branch, last one 8 months ago
License: mit

Keras implementation of the Yahoo Open-NSFW model
Created 2021-11-01
198 commits to main branch, last one 5 months ago
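
The Open-NSFW lineage of models expects VGG-style input preprocessing. A hedged sketch, assuming a 224x224 RGB input, BGR channel order, and the standard VGG per-channel means; the exact pipeline of any given port may differ:

```python
# VGG-style preprocessing sketch: convert RGB to BGR and subtract the
# standard VGG per-channel means (assumed values, not taken from this repo).
import numpy as np

VGG_MEAN_BGR = np.array([103.939, 116.779, 123.68], dtype=np.float32)

def preprocess(rgb_image: np.ndarray) -> np.ndarray:
    """rgb_image: HxWx3 uint8 array, already resized to 224x224."""
    bgr = rgb_image[..., ::-1].astype(np.float32)  # RGB -> BGR channel order
    return bgr - VGG_MEAN_BGR                      # zero-center per channel
```

Getting the channel order and mean subtraction right matters: feeding RGB into a model trained on BGR quietly degrades accuracy without raising any error.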

A .NET image and video classifier, written in C#, used to identify explicit/pornographic content.
Created 2021-09-16
157 commits to main branch, last one 4 months ago
License: unknown

NudeNet: NSFW Object Detection for TFJS and NodeJS
Created 2021-03-25
25 commits to main branch, last one about a year ago

✅ CODAR is a framework built with PyTorch to analyze posts (text + media) and predict cyberbullying and offensive content.
Created 2020-08-23
23 commits to master branch, last one 3 years ago

An NSFW Image Classification REST API for effortless content moderation, built with Node.js, TensorFlow, and Parse Server
Created 2020-05-26
2 commits to master branch, last one 3 years ago
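
A moderation API like this usually wraps a classifier that returns per-class probabilities and applies a threshold to decide whether to flag an image. A minimal sketch of that decision step; the class names and threshold are illustrative assumptions, not this project's API:

```python
# Threshold-based moderation decision sketch. UNSAFE_CLASSES and the
# default threshold are assumed example values, not the project's.
UNSAFE_CLASSES = {"Porn", "Hentai", "Sexy"}

def moderate(predictions: dict, threshold: float = 0.7) -> bool:
    """Return True if the image should be blocked.

    predictions maps class names to probabilities summing to ~1.0.
    """
    unsafe_score = sum(p for c, p in predictions.items() if c in UNSAFE_CLASSES)
    return unsafe_score >= threshold
```

Summing the unsafe classes rather than checking only the top label avoids missing images where the "unsafe" mass is split across several categories.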

An anti-spam/NSFW Telegram bot written in Python with Pyrogram.
Created 2021-07-14
28 commits to master branch, last one 4 months ago

A JavaScript image classifier, written in TypeScript, used to identify explicit/pornographic content.
Created 2022-09-04
55 commits to main branch, last one 10 months ago