
Building a Sexual Content Filter Using Python, Flask, and Docker

Danilo is a Senior Software Engineer with an Associate Degree and 4 years of experience in some of the most famous IT companies in Brazil.

Let's learn to build a sexual content filter in Python (and more!)


Want to Filter Sexual Content?

For a variety of reasons, many people on the internet are looking for a way to analyze images and block sexual content automatically. Unfortunately, training a convolutional neural network from scratch requires thousands of labeled images, which puts it out of reach for most projects. In this article I'll show you how to build a simple application that does the job for you, without worrying about the neural network internals: we'll still use a convolutional neural network, but the model is already trained, so there's nothing for you to train.

In This Article

  • How to create a Python Rest API with Flask.
  • How to create a simple service to check if the content is sexual or not.


Prerequisites

  • Docker installed.
  • Python 3 installed.
  • Pip installed.

Creating the Directory Structure

  • Open your favorite terminal.
  • Create a project's root directory where we're going to put the project's files.
mkdir sexual_content_classification_api
  • Let's navigate to the folder we just created and create some files.
cd sexual_content_classification_api
touch Dockerfile app.py
  • Open the project's root directory with your favorite code editor.

Creating the Flask API

  • Create and open a Python file for the API (I'll call it app.py here) in your code editor.
  • Let's code our prediction and health check routes.
import os
import uuid

import requests
from flask import Flask, request
from open_nsfw_python3 import NSFWClassifier

app = Flask('sexual_content_classification_api')
classifier = NSFWClassifier()

@app.route('/health', methods=['GET'])
def health():
    return {
        "status": "OK"
    }, 200

@app.route('/classify', methods=['GET'])
def classify_image():
    try:
        url = request.json['image']
        print('Downloading the image: {}'.format(url))
        r = requests.get(url, allow_redirects=True)
        # Save the download under a unique temporary name
        filename = str(uuid.uuid4())
        with open(filename, 'wb') as f:
            f.write(r.content)
        try:
            score = classifier.get_score(filename)
        finally:
            os.remove(filename)  # clean up the temporary file
        return {
            "score": score
        }, 200
    except Exception as err:
        return str(err), 400
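The classify route writes the downloaded bytes to a uniquely named temporary file before handing it to the classifier, then removes the file. That save-score-cleanup pattern can be sketched in isolation (the helper name and the stand-in for the scoring call are mine, not part of any API):

```python
import os
import uuid

def save_to_temp_file(content: bytes) -> str:
    """Write raw bytes to a file with a random, collision-safe name."""
    filename = str(uuid.uuid4())
    with open(filename, 'wb') as f:
        f.write(content)
    return filename

# Usage: save, act on the file, then always clean up.
path = save_to_temp_file(b"fake image bytes")
try:
    size = os.path.getsize(path)  # stand-in for classifier.get_score(path)
finally:
    os.remove(path)
```

The try/finally guarantees the temporary file is deleted even if scoring raises, so failed requests don't pile up files on disk.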

Creating the Docker Environment

  • Let's implement our Dockerfile to install the required Python modules and run the application.
FROM python:3.7.4
COPY ./ ./
RUN pip install open-nsfw-python3==0.0.5
RUN pip install requests==2.22.0
RUN pip install flask==1.1.1
RUN apt update && apt install caffe-cpu --yes

ENV PYTHONPATH=/usr/lib/python3/dist-packages:
ENV FLASK_APP=app.py

CMD flask run -h 0.0.0.0 -p 80
  • Building the docker image.

docker build -t sexual_content_classification_api:latest .
  • Starting a container on port 80 of your local machine.
docker run -t -p 80:80 sexual_content_classification_api:latest
  • The API should be running and ready to receive requests.
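If you prefer Python over curl, a minimal client for the two routes might look like this (it assumes the container from the previous step is listening on port 80 of localhost; the function names are my own):

```python
import requests

API_BASE = "http://localhost:80"  # assumed address of the running container

def check_health(base: str = API_BASE) -> dict:
    """Call the /health route and return the parsed JSON body."""
    resp = requests.get(f"{base}/health")
    resp.raise_for_status()
    return resp.json()

def classify(image_url: str, base: str = API_BASE) -> float:
    """Send an image URL to /classify and return its score."""
    resp = requests.get(f"{base}/classify", json={"image": image_url})
    resp.raise_for_status()
    return resp.json()["score"]
```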

Testing Our API

  • Testing if the API is online. I'm using curl here, but you're free to use your favorite HTTP client.
curl localhost/health
  • Expected response:
{"status": "OK"}
  • Testing the classification route (put the URL of the image you want to check in the image field).
curl -X GET localhost/classify -H 'Content-Type: application/json' -d '{"image":""}'
  • Expected response: a JSON object containing the score attribute.
  • The score attribute in the response object is a confidence rating from 0 to 1, where 0 means no sexual content and 1 means sexual content.
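Since the score is a number between 0 and 1, the simplest way to act on it is to pick a cutoff. A minimal sketch (the 0.8 threshold is my choice for illustration, not something the model prescribes):

```python
def is_allowed(score: float, threshold: float = 0.8) -> bool:
    """Return True when the image scores below the blocking threshold."""
    return score < threshold

print(is_allowed(0.05))  # a low score: content passes the filter
print(is_allowed(0.97))  # a high score: content is blocked
```

In practice you would tune the threshold against your own sample images, trading off false positives against false negatives.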

That's all, folks! I hope you enjoyed this article; please let me know if you have any questions.

You can get the source code of this article at the following link:

© 2019 Danilo Oliveira
