10 results found
Primary languages: Python (3), JavaScript (2), TypeScript (2), C# (1), CSS (1), Shell (1)
Collection of scripts to aggregate image data for the purposes of training an NSFW Image Classifier
Created 2019-01-11 · 52 commits to main branch, last one 2 years ago
A free, open source, and privacy-focused browser extension to block “not safe for work” content built using TypeScript and TensorFlow.js.
Created 2020-07-08 · 357 commits to master branch, last one about a year ago
Keras implementation of the Yahoo Open-NSFW model
Created 2021-11-01 · 225 commits to main branch, last one 3 months ago
A .NET image and video classifier used to identify explicit/pornographic content written in C#.
Created 2021-09-16 · 157 commits to main branch, last one 10 months ago
NudeNet: NSFW Object Detection for TFJS and NodeJS
This repository has been archived
Created 2021-03-25 · 25 commits to main branch, last one about a year ago
✅ CODAR is a framework built using PyTorch to analyze posts (text + media) and predict cyberbullying and offensive content.
Created 2020-08-23 · 23 commits to master branch, last one 4 years ago
An NSFW Image Classification REST API for effortless Content Moderation built with Node.js, Tensorflow, and Parse Server
Created 2020-05-26 · 2 commits to master branch, last one 4 years ago
Anti-spam/NSFW Telegram bot written in Python with Pyrogram.
This repository has been archived
Created 2021-07-14 · 32 commits to master branch, last one 3 months ago
REST API written in Python to classify NSFW images.
Created 2021-04-27 · 10 commits to master branch, last one 9 months ago
A JavaScript image classifier used to identify explicit/pornographic content written in TypeScript.
Created 2022-09-04 · 58 commits to main branch, last one about a month ago