10 results found

Collection of scripts to aggregate image data for the purposes of training an NSFW Image Classifier
Created 2019-01-11
52 commits to main branch, last one 2 years ago
License: GPL-3.0
A free, open-source, privacy-focused browser extension that blocks “not safe for work” content, built with TypeScript and TensorFlow.js.
Created 2020-07-08
357 commits to master branch, last one about a year ago
License: MIT
Keras implementation of the Yahoo Open-NSFW model
Created 2021-11-01
225 commits to main branch, last one 4 months ago
A .NET image and video classifier, written in C#, used to identify explicit/pornographic content.
Created 2021-09-16
157 commits to main branch, last one 11 months ago
License: unknown
NudeNet: NSFW Object Detection for TFJS and NodeJS
This repository has been archived.
Created 2021-03-25
25 commits to main branch, last one about a year ago
✅ CODAR is a framework built with PyTorch to analyze posts (text + media) and predict cyberbullying and offensive content.
Created 2020-08-23
23 commits to master branch, last one 4 years ago
An NSFW image classification REST API for effortless content moderation, built with Node.js, TensorFlow, and Parse Server.
Created 2020-05-26
2 commits to master branch, last one 4 years ago
An anti-spam/NSFW Telegram bot, written in Python with Pyrogram.
This repository has been archived.
Created 2021-07-14
32 commits to master branch, last one 4 months ago
An image classifier for JavaScript, written in TypeScript, used to identify explicit/pornographic content.
Created 2022-09-04
58 commits to main branch, last one 2 months ago