NSFW detection on the client side via TensorFlow.js


Client-side indecent content checking


A simple JavaScript library to help you quickly identify unseemly images, all in the client's browser. NSFWJS isn't perfect, but it's pretty accurate (~90% on our test set of 15,000 images), and it's getting more accurate all the time.

Why would this be useful? Check out the announcement blog post.


The library returns probabilities for the following 5 classes:

  • Drawing - safe for work drawings (including anime)
  • Hentai - hentai and pornographic drawings
  • Neutral - safe for work neutral images
  • Porn - pornographic images, sexual acts
  • Sexy - sexually explicit images, not pornography
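Since each class comes back with a probability, a quick sketch of the result's shape can help. The array below is made up for illustration; real values come from model.classify, described later in this readme:

```javascript
// Hypothetical predictions array, in the shape NSFWJS returns:
// objects with a className and a probability.
const predictions = [
  { className: 'Neutral', probability: 0.92 },
  { className: 'Drawing', probability: 0.05 },
  { className: 'Sexy', probability: 0.02 },
  { className: 'Porn', probability: 0.007 },
  { className: 'Hentai', probability: 0.003 },
]

// The top guess is simply the highest-probability entry.
const top = predictions.reduce((a, b) => (a.probability >= b.probability ? a : b))
console.log(top.className) // 'Neutral'
```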

The demo is continuously deployed from this repository. Give it a go!

How to use the module

With async/await support:

```js
import * as nsfwjs from 'nsfwjs'

const img = document.getElementById('img')

// Load model from my S3.
// See the section on hosting the model files on your site.
const model = await nsfwjs.load()

// Classify the image
const predictions = await model.classify(img)
console.log('Predictions: ', predictions)
```

Without async/await support:

```js
import * as nsfwjs from 'nsfwjs'

const img = document.getElementById('img')

// Load model from my S3.
// See the section on hosting the model files on your site.
nsfwjs.load().then(function (model) {
  // Classify the image
  model.classify(img).then(function (predictions) {
    console.log('Predictions: ', predictions)
  })
})
```

load the model

Before you can classify any images, you'll need to load the model. For many reasons, you should use the optional parameter and load the model from your own website. See the install directions for how.

```js
const model = await nsfwjs.load('/path/to/model/directory/')
```


Parameters:

  • Optional URL to the model.json

Returns:

  • Ready-to-use NSFWJS model object

classify an image

This function can take any browser-based image element (`<img>`, `<video>`, `<canvas>`) and returns an array of the most likely predictions and their confidence levels.

```js
// Return top 3 guesses (instead of all 5)
const predictions = await model.classify(img, 3)
```


Parameters:

  • Tensor, image data, image element, video element, or canvas element to check
  • Number of results to return (default all 5)

Returns:

  • Array of objects that contain className and probability. Array size is determined by the second parameter in the classify function.
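One common way to consume that array is a simple threshold check. Below is a minimal sketch; the isUnsafe helper and the 0.7 threshold are our own illustrative choices, not part of the NSFWJS API:

```javascript
// Hypothetical helper (not part of NSFWJS): flag an image as unsafe
// when any risky class exceeds a probability threshold.
const RISKY = new Set(['Porn', 'Hentai', 'Sexy'])

function isUnsafe(predictions, threshold = 0.7) {
  return predictions.some(
    (p) => RISKY.has(p.className) && p.probability >= threshold
  )
}

// Examples with made-up prediction arrays:
console.log(isUnsafe([{ className: 'Porn', probability: 0.9 }])) // true
console.log(isUnsafe([{ className: 'Neutral', probability: 0.95 }])) // false
```

Tune the threshold to your own tolerance for false positives versus false negatives.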


Install

NSFWJS is powered by TensorFlow.js as a peer dependency. If your project does not already have TFJS, you'll need to add it.

```bash
# peer dependency
$ yarn add @tensorflow/tfjs
# install NSFWJS
$ yarn add nsfwjs
```

Host your own model

The magic that powers NSFWJS is the NSFW detection model. By default, this node module is pulling from my S3, but I make no guarantees that I'll keep that download link available forever. It's best for the longevity of your project that you download and host your own version of the model files. You can then pass the relative URL to your hosted files in the load function. If you can come up with a way to bundle the model into the NPM package, I'd love to see a PR to this repo!

Run the Example

The demo is available in the example folder.

To run the demo, run yarn prep, which will copy the latest code into the demo. After that's done, you can cd into the demo folder and run it with yarn start.


The model was trained in Keras over several days on 60+ GB of data. Be sure to check out the code for the model, which was trained on data provided by Alexander Kim's nsfw_data_scraper.

Open Source

NSFWJS, as open source, is free to use and always will be ❤️. It's MIT licensed, and we'll always do our best to help and quickly answer issues. If you'd like to get a hold of us, join our community Slack.


Infinite Red offers premium training and support. Email us to get in touch.


Thanks goes to these wonderful people (emoji key):

  • Gant Laborde: 💬 📝 💻 💡 🤔 🚇 👀 ⚠️
  • Jamon Holmgren: 📖 🤔
  • Jeff Studenski

This project follows the all-contributors specification. Contributions of any kind welcome!
