Detect NSFW Content in an Image in Swift

Maintaining a website that allows users to upload content is a demanding task; given the sheer variety of material available online, it is difficult to track which uploads contain racy or inappropriate content and which do not. If you frequently encounter content on your site that could damage your company’s reputation, it may be time to integrate a safety net. In this quick tutorial, we will show you how to automatically detect ‘Not Safe for Work’ content in an image by calling an API from Swift.

Simply input your image file and API key into the example code below to call the function; start by adding the required imports (the conditional import makes the code work on Linux as well as Apple platforms):

import Foundation
#if canImport(FoundationNetworking)
import FoundationNetworking
#endif
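With the imports in place, the request itself can be sketched as below. This is a minimal illustration using URLSession with a multipart/form-data upload; the endpoint path, the `imageFile` form field name, and the response fields mentioned in the comments are assumptions based on Cloudmersive's Image API conventions, and the file path and API key are placeholders you must replace with your own values:

```swift
// Hypothetical values: substitute your own image path and API key.
let imageURL = URL(fileURLWithPath: "/path/to/image.jpg")
let apiKey = "YOUR-API-KEY"

// Endpoint path is an assumption based on Cloudmersive's Image API conventions.
var request = URLRequest(url: URL(string: "https://api.cloudmersive.com/image/nsfw/classify")!)
request.httpMethod = "POST"
request.addValue(apiKey, forHTTPHeaderField: "Apikey")

// Build a multipart/form-data body containing the image file.
let boundary = UUID().uuidString
request.addValue("multipart/form-data; boundary=\(boundary)", forHTTPHeaderField: "Content-Type")

guard let imageData = try? Data(contentsOf: imageURL) else {
    fatalError("Could not read image file at \(imageURL.path)")
}

var body = Data()
body.append("--\(boundary)\r\n".data(using: .utf8)!)
body.append("Content-Disposition: form-data; name=\"imageFile\"; filename=\"image.jpg\"\r\n".data(using: .utf8)!)
body.append("Content-Type: application/octet-stream\r\n\r\n".data(using: .utf8)!)
body.append(imageData)
body.append("\r\n--\(boundary)--\r\n".data(using: .utf8)!)
request.httpBody = body

// Perform the request and print the raw JSON classification result.
// A semaphore keeps a command-line program alive until the async call returns.
let semaphore = DispatchSemaphore(value: 0)
URLSession.shared.dataTask(with: request) { data, response, error in
    defer { semaphore.signal() }
    if let error = error {
        print("Request failed: \(error)")
        return
    }
    if let data = data, let result = String(data: data, encoding: .utf8) {
        // The JSON response typically includes a classification outcome and a score.
        print(result)
    }
}.resume()
semaphore.wait()
```

In a production app you would parse the JSON response with `JSONDecoder` and act on the classification result rather than printing it, and you would avoid blocking with a semaphore in UI code.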

If you need to obtain an API key, you can do so by registering for a free account on the Cloudmersive website; this will provide 800 calls/month across any of our APIs.

There’s an API for that. Cloudmersive is a leader in Highly Scalable Cloud APIs.