
NSFW_MultiDomain
The NSFW_MultiDomain dataset is a curated image classification dataset focused on multi-domain adult content recognition. It consists of 5 distinct categories aimed at facilitating the development of robust NSFW (Not Safe For Work) image classification models. This dataset enables training and benchmarking of models that can distinguish between subtle variations in explicit and non-explicit content across artistic, animated, and real-world imagery.
Classes
The dataset includes the following classification labels:
- Class 0: Anime Picture
- Class 1: Hentai
- Class 2: Normal (Safe-for-Work content)
- Class 3: Pornography
- Class 4: Enticing or Sensual (suggestive but not explicit)
These categories were selected to improve performance in nuanced adult content detection and to enable precise moderation across platforms with varied content policies.
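For training or configuring a classifier head, the class indices above translate directly into an id-to-label mapping. A minimal sketch in Python (the label strings follow the list above; the exact names stored in the dataset metadata may differ):

# Label mapping for the five classes listed above (names are taken from this card;
# the strings stored in the dataset metadata may differ slightly).
ID2LABEL = {
    0: "Anime Picture",
    1: "Hentai",
    2: "Normal (Safe-for-Work)",
    3: "Pornography",
    4: "Enticing or Sensual",
}
LABEL2ID = {name: idx for idx, name in ID2LABEL.items()}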
Trained Models on This Dataset
Several models have been trained using the NSFW_MultiDomain dataset:
- https://huggingface.co/prithivMLmods/Mature-Content-Detection
- https://huggingface.co/prithivMLmods/siglip2-x256-explicit-content
- https://huggingface.co/prithivMLmods/siglip2-x256p32-explicit-content
- https://huggingface.co/strangerguardhf/nsfw_image_detector
These models are built on a range of architectures, including SigLIP variants and custom classifier heads, and are fine-tuned for NSFW detection.
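As a rough usage sketch, any of these checkpoints that exposes a standard image-classification head can be loaded with the Transformers pipeline API; the exact preprocessing and label names come from each checkpoint's own configuration, so check the individual model cards:

from transformers import pipeline

# Load one of the checkpoints listed above (assumes a standard image-classification head).
classifier = pipeline(
    "image-classification",
    model="prithivMLmods/siglip2-x256-explicit-content",
)

# Classify a local image; returns a list of {"label": ..., "score": ...} dictionaries.
predictions = classifier("example.jpg")
print(predictions)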
Dataset Structure
- Format: Image files with associated classification labels
- Domains: Anime, hentai, photography, illustrations, and mixed content
- Number of classes: 5
- Primary task: Multiclass image classification
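In Hugging Face Datasets terms, a dataset with this structure is typically described by an image column plus a 5-way class label. The sketch below assumes column names of "image" and "label", which are assumptions to verify against the repository:

from datasets import ClassLabel, Features, Image

# Assumed schema for a 5-class image-classification dataset
# (column names "image" and "label" are assumptions, not confirmed by the repository).
features = Features({
    "image": Image(),
    "label": ClassLabel(names=[
        "Anime Picture",
        "Hentai",
        "Normal",
        "Pornography",
        "Enticing or Sensual",
    ]),
})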
Intended Use
The dataset is intended for research and deployment in:
- Content moderation tools
- Adult content filtering systems
- Safety-aware generative models
- Multimodal models involving vision and language
Note: This dataset contains sexually explicit material and should only be used in legally compliant settings with appropriate safeguards.
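As an illustration of the content-moderation use case, a simple filter can flag an image whenever a suggestive or explicit class scores above a threshold. The model ID is taken from the list of fine-tuned checkpoints above; the label set and the 0.5 threshold are illustrative assumptions, not values defined by the dataset:

from transformers import pipeline

# Illustrative choices: which labels count as explicit/suggestive and the score threshold
# are assumptions; the label strings must match the checkpoint's own id2label names.
EXPLICIT_LABELS = {"Hentai", "Pornography", "Enticing or Sensual"}
THRESHOLD = 0.5

classifier = pipeline(
    "image-classification",
    model="prithivMLmods/siglip2-x256-explicit-content",
)

def is_flagged(image_path: str) -> bool:
    """Return True if any explicit or suggestive label scores above the threshold."""
    # The pipeline's default top_k of 5 already covers all five classes here.
    for prediction in classifier(image_path):
        if prediction["label"] in EXPLICIT_LABELS and prediction["score"] >= THRESHOLD:
            return True
    return False

print(is_flagged("example.jpg"))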
Dataset Access
Dataset repository: https://huggingface.co/datasets/strangerguardhf/NSFW_MultiDomain
To load the dataset using the Hugging Face Datasets library:
from datasets import load_dataset

# Repository ID as used on the Hub (see the dataset page above for the current name)
dataset = load_dataset("strangerguardhf/NSFW-MultiDomain-Classification")
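After loading, the splits and features can be inspected before training; the "train" split name below is an assumption and should be checked against the repository:

# Inspect the available splits and the feature schema.
print(dataset)

# The split name "train" is an assumption; adjust to whatever the repository provides.
train_split = dataset["train"]
print(train_split.features)   # expected: an image column and a class-label column
print(train_split[0])         # first example: the image object and its label index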
Licensing and Limitations
- License: Refer to the dataset page for license details.
- Usage Warning: This dataset is only appropriate for users and institutions that are legally permitted and ethically equipped to handle NSFW data.
- Bias & Fairness: Like many NSFW datasets, there may be cultural or demographic
