Scan and block sensitive photos and content in Node.js

In this article, we will scan photos for sensitive content, such as nudity, violence, and other potentially offensive material. Based on a sensitivity score, we can then choose whether to block the content.

This is actually pretty easy to accomplish. The first thing we need to do is add a reference to our package.json for the image library:

"dependencies": {
  "cloudmersive-image-api-client": "^1.1.4"
}
Now we will simply call nsfwClassify:

var CloudmersiveImageApiClient = require('cloudmersive-image-api-client');
var fs = require('fs');

var defaultClient = CloudmersiveImageApiClient.ApiClient.instance;

// Configure API key authorization: Apikey
var Apikey = defaultClient.authentications['Apikey'];
Apikey.apiKey = 'YOUR API KEY';

var apiInstance = new CloudmersiveImageApiClient.NsfwApi();

// Image file to perform the operation on. Common file formats such as PNG and JPEG are supported.
var imageFile = Buffer.from(fs.readFileSync("C:\\temp\\inputfile.jpg").buffer);

var callback = function(error, data, response) {
  if (error) {
    console.error(error);
  } else {
    console.log('API called successfully. Returned data: ' + data);
  }
};

apiInstance.nsfwClassify(imageFile, callback);

This will return a data structure with three values: Successful, which indicates whether the classification operation was successful; Score, a value between 0.0 and 1.0, where any value above 0.8 indicates high-probability NSFW content; and ClassificationOutcome, an enum that helps you identify the classification of the image.
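To act on that result, you might apply the 0.8 cutoff described above inside the callback. Here is a minimal, hypothetical helper (the function name and threshold parameter are my own; the Successful and Score field names come from the response described above):

```javascript
// Hypothetical helper: decide whether to block an image based on the
// classification result. The 0.8 default follows the scoring guidance
// above (scores over 0.8 are high-probability NSFW content).
function shouldBlockImage(result, threshold) {
  threshold = threshold || 0.8;
  // If classification failed, err on the side of caution and block.
  if (!result.Successful) {
    return true;
  }
  return result.Score >= threshold;
}

console.log(shouldBlockImage({ Successful: true, Score: 0.95 })); // true
console.log(shouldBlockImage({ Successful: true, Score: 0.1 }));  // false
```

You could call shouldBlockImage(data) inside the success branch of the callback and, for example, refuse to store or serve the uploaded file when it returns true.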

You can get an API key from Cloudmersive that includes 50,000 API calls/month at no cost and with no expiration.
