How to scan and block sensitive photos in Salesforce Apex
User-submitted content can be a real goldmine, but it can cause a slew of problems if you don’t handle it right. Racy NSFW photos are among the most common offenders, and they can spark user outrage or get your site flagged by search engines. So now that we know this is a real problem, how do we deal with it? Train an AI through deep learning to recognize naughty images? That’s a ton of work, but luckily we’ve already done it for you and packaged it in a handy API.
We begin, simply enough, by downloading our Apex client and extracting the client folder into our project folder.
Our test photograph can now be run through nsfwClassify:
SwagNsfwApi api = new SwagNsfwApi();
SwagClient client = api.getClient();

// Configure API key authorization: Apikey
ApiKeyAuth Apikey = (ApiKeyAuth) client.getAuthentication('Apikey');
Apikey.setApiKey('YOUR API KEY');

Map<String, Object> params = new Map<String, Object>{
    // Replace this placeholder with the bytes of the image you want to scan
    'imageFile' => Blob.valueOf('Sample text file\nContents')
};

try {
    // cross your fingers
    SwagNsfwResult result = api.nsfwClassify(params);
    System.debug(result);
} catch (Swagger.ApiException e) {
    // ...handle your exceptions
}
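In a real org, the image bytes will usually come from an uploaded file rather than a text placeholder. As a rough sketch, assuming the photo was uploaded as a Salesforce File and that `photoDocumentId` holds its ContentDocument Id, you could pull the binary from ContentVersion and pass it in as the `imageFile` parameter:

```apex
// Sketch: load the latest version of an uploaded Salesforce File
// ('photoDocumentId' is an assumed variable holding the ContentDocument Id)
ContentVersion cv = [
    SELECT VersionData
    FROM ContentVersion
    WHERE ContentDocumentId = :photoDocumentId
    ORDER BY CreatedDate DESC
    LIMIT 1
];

Map<String, Object> params = new Map<String, Object>{
    'imageFile' => cv.VersionData // the raw image bytes
};
```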
Done! Let’s look at an example photo and result.
And our result:
{
"Successful": true,
"Score": 0.0203384142369,
"ClassificationOutcome": "SafeContent_HighProbability"
}
As you can see, we were given a score between 0 and 1 (higher meaning more likely to be NSFW) along with a classification. This lets you choose a score threshold that suits your safety needs.
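For instance, a minimal moderation check might compare the returned score against a cutoff you pick. This is only a sketch: it assumes the `SwagNsfwResult` object exposes the JSON "Score" field as a `score` property, and the 0.5 threshold is an arbitrary choice you should tune for your own risk tolerance:

```apex
// Assumes 'result' is the SwagNsfwResult returned by nsfwClassify above
Decimal threshold = 0.5; // assumed cutoff; raise for fewer false positives

if (result.score != null && result.score >= threshold) {
    // Block or quarantine the photo pending manual review
    System.debug('Photo flagged as potentially NSFW');
} else {
    System.debug('Photo looks safe to publish');
}
```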