How to Identify NSFW Content in an Image in Ruby

Cloudmersive
2 min read · Aug 26, 2021

Content classified as ‘Not Safe for Work’ (NSFW) usually features racy, offensive, or otherwise undesirable imagery. To ensure none of this content makes its way onto your company’s website or application, you can use the following API in Ruby to automatically detect NSFW content in an image and return an NSFW score and classification.

Let’s start by installing the Ruby client:

gem 'cloudmersive-image-recognition-api-client', '~> 2.0.4'
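
If you aren’t managing dependencies through a Gemfile, the same gem can be installed directly from the command line (the version pinned above is simply the one used in this example; newer releases may exist):

gem install cloudmersive-image-recognition-api-client -v 2.0.4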

With the client installed, we can configure the API key, load the image file, and call the classification function:

# load the gem
require 'cloudmersive-image-recognition-api-client'
# setup authorization
CloudmersiveImageRecognitionApiClient.configure do |config|
  # Configure API key authorization: Apikey
  config.api_key['Apikey'] = 'YOUR API KEY'
  # Uncomment the following line to set a prefix for the API key, e.g. 'Bearer' (defaults to nil)
  #config.api_key_prefix['Apikey'] = 'Bearer'
end

api_instance = CloudmersiveImageRecognitionApiClient::NsfwApi.new

image_file = File.new('/path/to/inputfile') # File | Image file to perform the operation on. Common file formats such as PNG, JPEG are supported.

begin
  # Not safe for work (NSFW) racy content classification
  result = api_instance.nsfw_classify(image_file)
  p result
rescue CloudmersiveImageRecognitionApiClient::ApiError => e
  puts "Exception when calling NsfwApi->nsfw_classify: #{e}"
end
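
The printed result contains the NSFW score and classification returned by the API. As a minimal sketch of acting on that response (the attribute names score and classification_outcome are assumptions based on the generated client’s response model, and the 0.8 threshold is arbitrary; check the printed output for the exact fields and tune the cutoff to your needs):

# Act on the classification result.
# NOTE: attribute names and the 0.8 cutoff below are illustrative assumptions.
if result.score && result.score > 0.8
  puts "Flagged as NSFW (score: #{result.score}, outcome: #{result.classification_outcome})"
else
  puts "Image appears safe (score: #{result.score})"
end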

If you need to retrieve an API key, you can do so by registering for a free account on the Cloudmersive website; this provides 800 calls/month across our entire library of APIs.

