12 results found.

A free, open source, and privacy-focused browser extension to block “not safe for work” content, built using TypeScript and TensorFlow.js.
175 forks · 1.7k stars · gpl-3.0 · 27 open issues
Created 2020-07-08 · 357 commits to master branch, last one about a year ago

A browser extension that enables you to navigate the web with respect for your Islamic values, protect your privacy, and reduce browsing distractions by auto-detecting and blurring "Haram" content.
45 forks · 651 stars · agpl-3.0 · 14 open issues
Created 2023-09-22 · 153 commits to main branch, last one 5 months ago

A .NET image and video classifier, written in C#, used to identify explicit/pornographic content.
Created 2021-09-16 · 157 commits to main branch, last one 11 months ago

NudeNet: NSFW Object Detection for TFJS and NodeJS
75 forks · 192 stars · license unknown · 11 open issues
This repository has been archived.
Created 2021-03-25 · 25 commits to main branch, last one about a year ago

An NSFW image classification REST API for effortless content moderation, built with Node.js, TensorFlow, and Parse Server.
Created 2020-05-26 · 2 commits to master branch, last one 4 years ago

A fast, accurate API for detecting NSFW images.
Created 2024-04-22 · 49 commits to main branch, last one 6 months ago

An anti-spam/NSFW Telegram bot written in Python with Pyrogram.
This repository has been archived.
Created 2021-07-14 · 32 commits to master branch, last one 4 months ago

A simple NSFW classifier based on Keras and TensorFlow.
Created 2022-03-23 · 39 commits to main branch, last one 9 months ago

State-of-the-art & ready-to-use mini NLP models for Indian languages.
Created 2020-10-25 · 51 commits to main branch, last one 3 years ago

A JavaScript image classifier, written in TypeScript, used to identify explicit/pornographic content.
Created 2022-09-04 · 58 commits to main branch, last one 2 months ago

An NSFW/safety checker node for ComfyUI.
This repository has been archived.
Created 2023-11-26 · 15 commits to main branch, last one 7 months ago
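
Several of the results above are TensorFlow.js-based image classifiers that can be run from Node.js. As a rough, hedged illustration of how such a classifier is typically invoked, here is a minimal sketch assuming the nsfwjs and @tensorflow/tfjs-node packages; this is not necessarily the API of any repository listed above.

// Minimal sketch: classify a local image with a TensorFlow.js NSFW model.
// Assumes the `nsfwjs` and `@tensorflow/tfjs-node` packages are installed;
// the listed repositories may use different models or interfaces.
import * as fs from "fs";
import * as tf from "@tensorflow/tfjs-node";
import * as nsfwjs from "nsfwjs";

async function classifyImage(path: string): Promise<void> {
  const model = await nsfwjs.load();            // loads the default hosted model
  const buffer = fs.readFileSync(path);
  const image = tf.node.decodeImage(buffer, 3); // decode file to a 3-channel tensor
  const predictions = await model.classify(image as tf.Tensor3D);
  image.dispose();                              // release the tensor's memory
  for (const p of predictions) {
    console.log(`${p.className}: ${(p.probability * 100).toFixed(1)}%`);
  }
}

classifyImage("test.jpg").catch(console.error);

Browser-extension projects in this list would typically run the same classify step against in-page image elements instead of decoded file buffers.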