Detect Nudity/NSFW/Racy photos in Node.js
Here we want to detect nudity and other types of not-safe-for-work (NSFW) or racy photos, so that we can block this content or flag it for review. This approach leverages Deep Learning but can run on any regular Node.js server.
The first step is to add a reference to the package in your package.json:
"dependencies": {
"cloudmersive-image-api-client": "^1.1.4"
}
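If you prefer the command line, you can add the same dependency with npm instead of editing package.json by hand (assuming the package is published on the npm registry under the same name):
npm install cloudmersive-image-api-client --save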
Next up, we need to import the library:
var CloudmersiveImageApiClient = require('cloudmersive-image-api-client');
var fs = require('fs'); // also required, since we will read the image file from disk below
Finally, we need to call the method nsfwClassify:
var defaultClient = CloudmersiveImageApiClient.ApiClient.instance;

// Configure API key authorization: Apikey
var Apikey = defaultClient.authentications['Apikey'];
Apikey.apiKey = 'YOUR API KEY';

var apiInstance = new CloudmersiveImageApiClient.NsfwApi();

// Image file to perform the operation on; common file formats such as PNG and JPEG are supported
var imageFile = Buffer.from(fs.readFileSync("C:\\temp\\inputfile").buffer);

var callback = function(error, data, response) {
if (error) {
console.error(error);
} else {
console.log('API called successfully. Returned data: ' + data);
}
};
apiInstance.nsfwClassify(imageFile, callback);
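Note that the client uses a Node-style callback. If you would rather use async/await, you can wrap the call in a Promise. Here is a minimal sketch, assuming the apiInstance defined above (the classifyImage helper name is just for illustration):
// Illustrative helper, not part of the client library: wraps the callback-style call in a Promise
function classifyImage(imageFile) {
  return new Promise(function(resolve, reject) {
    apiInstance.nsfwClassify(imageFile, function(error, data, response) {
      if (error) {
        reject(error);
      } else {
        resolve(data);
      }
    });
  });
}
// Usage inside an async function:
// var result = await classifyImage(imageFile);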
Here is an example output result for a nude image:
{
"Successful": true,
"Score": 0.995321691036,
"ClassificationOutcome": "UnsafeContent_HighProbability"
}
As you can see, the image is correctly flagged as UnsafeContent_HighProbability and receives a Score of about 0.99. Scores range from 0.0 to 1.0: scores of 0.0–0.2 represent high-probability safe content, scores of 0.8–1.0 represent high-probability unsafe content, and scores between 0.2 and 0.8 represent increasing levels of raciness.
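To act on the classification in your application, you can map the Score to a block, review, or allow decision using those ranges. Here is a minimal sketch, assuming the result object shown above (handleClassification is a hypothetical helper name, not part of the Cloudmersive client):
// Map the returned Score to an action using the score ranges described above
function handleClassification(result) {
  if (!result.Successful) {
    return 'error'; // the classification call did not complete successfully
  }
  if (result.Score >= 0.8) {
    return 'block'; // high probability unsafe content
  }
  if (result.Score >= 0.2) {
    return 'review'; // increasingly racy content; flag for human review
  }
  return 'allow'; // high probability safe content
}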
That’s it! We are done! This will work on a wide array of content.