Scan and block sensitive photos and content in Node.js
In this article, we will scan photos for sensitive content, such as nudity, violence, and other potentially offensive material. Based on a sensitivity score, we can then choose whether to block the content.
This is actually pretty easy to accomplish. The first thing we need to do is add a reference to the image library in our package.json:
"dependencies": {
"cloudmersive-image-api-client": "^1.1.4"
}
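Alternatively, you can install the package from the command line with npm, which will add the dependency to package.json for you:

npm install cloudmersive-image-api-client --save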
Now we will simply call nsfwClassify:
var CloudmersiveImageApiClient = require('cloudmersive-image-api-client');
var fs = require('fs'); // needed to read the image file from disk

// Configure API key authorization: Apikey
var defaultClient = CloudmersiveImageApiClient.ApiClient.instance;
var Apikey = defaultClient.authentications['Apikey'];
Apikey.apiKey = 'YOUR API KEY';

var apiInstance = new CloudmersiveImageApiClient.NsfwApi();

// Image file to perform the operation on. Common file formats such as PNG and JPEG are supported.
var imageFile = Buffer.from(fs.readFileSync("C:\\temp\\inputfile.jpg").buffer);

var callback = function(error, data, response) {
if (error) {
console.error(error);
} else {
console.log('API called successfully. Returned data: ' + data);
}
};
apiInstance.nsfwClassify(imageFile, callback);
This will return a data structure with three values: Successful, which indicates whether the classification operation was successful; Score, which ranges from 0.0 to 1.0, with values above 0.8 indicating a high probability of NSFW content; and ClassificationOutcome, an enum that helps you identify the classification of the image.
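To act on the score, you can extend the callback with a simple threshold check. The sketch below assumes the apiInstance and imageFile set up in the code above; blockImage and allowImage are hypothetical placeholders for your own application logic, and the 0.8 cutoff follows the guidance above:

// A minimal sketch of threshold-based blocking, reusing apiInstance and
// imageFile from the setup above. blockImage and allowImage are hypothetical
// handlers you would replace with your own logic.
apiInstance.nsfwClassify(imageFile, function(error, data, response) {
  if (error) {
    console.error(error);
    return;
  }
  if (data.Successful && data.Score >= 0.8) {
    // Scores above 0.8 indicate high-probability NSFW content; block it.
    blockImage(data.ClassificationOutcome);
  } else {
    allowImage(data.ClassificationOutcome);
  }
});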
You can get an API key from Cloudmersive that includes 50,000 API calls/month at no cost and with no expiration.