Scan and block sensitive photos and content in PHP

Using the following API, we can assign each photo a sensitivity score based on the potential NSFW material it might contain, including nudity and violence. We can then use this score to decide whether each photo should be blocked from use.

To begin, let's add the Image Recognition client to our composer.json:

"require": {
"cloudmersive/cloudmersive_imagerecognition_api_client": "^1.4",
}
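With the dependency declared, run composer install (or composer update) from the project root so the client is pulled into vendor/ and picked up by the autoloader.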

Then call nsfwClassify:

<?php
require_once(__DIR__ . '/vendor/autoload.php');

// Configure API key authorization: Apikey
$config = Swagger\Client\Configuration::getDefaultConfiguration()->setApiKey('Apikey', 'YOUR_API_KEY');

$apiInstance = new Swagger\Client\Api\NsfwApi(
    new GuzzleHttp\Client(),
    $config
);

$image_file = "/path/to/file"; // \SplFileObject | Image file to perform the operation on. Common file formats such as PNG, JPEG are supported.

try {
    $result = $apiInstance->nsfwClassify($image_file);
    print_r($result);
} catch (Exception $e) {
    echo 'Exception when calling NsfwApi->nsfwClassify: ', $e->getMessage(), PHP_EOL;
}
?>

This will supply us with three values. First, whether the classification was successful. Second, the aforementioned sensitivity score between 0.0 and 1.0, with 0.8 and above indicating high risk. Third, a ClassificationOutcome enum to assist with categorizing the image in question.
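To act on that score, we can wrap the decision in a small helper. Here is a minimal sketch, assuming the Swagger-generated result model exposes getSuccessful() and getScore() accessors (the generated client's usual naming); shouldBlockImage is a hypothetical helper name, and the 0.8 threshold mirrors the high-risk cutoff mentioned above.

<?php
// Minimal sketch: decide whether to block an image based on the
// NSFW classification result ($result) returned by nsfwClassify above.
// Assumes the generated model exposes getSuccessful() and getScore().
function shouldBlockImage($result, float $threshold = 0.8): bool
{
    // Fail safe: if the classification did not succeed, block the image.
    if (!$result->getSuccessful()) {
        return true;
    }
    // Scores range from 0.0 (clean) to 1.0 (explicit); 0.8+ is high risk.
    return $result->getScore() >= $threshold;
}

if (shouldBlockImage($result)) {
    echo 'Image blocked: flagged as potential NSFW content.', PHP_EOL;
} else {
    echo 'Image allowed.', PHP_EOL;
}
?>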

And we are done! Wave goodbye to issues with NSFW content.

