Real-time Image Moderation for Chat with Sightengine

2 min read Michael Carroll on Nov 17, 2016

Sightengine provides a simple API to analyze, moderate, and filter images for nudity, validate profile pictures, or check image type. It can be integrated into a discussion group or forum to instantly filter out user-generated photos that break your code of conduct. And now, with the Sightengine PubNub Block for image moderation, you can scan and filter images in a real-time chat application, a casual game, or any app where users send and receive images in real time.

Getting Started with Image Moderation

Getting started with Sightengine and BLOCKS couldn’t be easier. Head over to the BLOCKS catalog and click ‘Try it Now’. Create your free PubNub account first if you don’t already have one. Follow the prompts to create a copy of the Sightengine Block. Next, retrieve your API user and API secret values from Sightengine and note them down.

Scroll down to where it says API key in the block code, and put in your API secret value. Now start the block. You can use the existing test payload or replace it with your own; just set the ‘image’ value to the URL of an image you want to test. For example:

   "image": ""

Now press the publish button to see the result. You will get something back like this:

   "nudity": { 
       "raw": 0.034964, 
       "partial": 0.043041, 
       "safe": 0.921994 
   "media": {

And that’s it!
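On the receiving side of a chat, you would typically compare the "safe" score against a threshold before rendering an image message. The sketch below assumes the moderation scores have been attached to the message in the shape shown earlier; the threshold value and function names are hypothetical and should be tuned for your app:

```python
SAFE_THRESHOLD = 0.85  # hypothetical cutoff; tune for your app

def should_display(message):
    """Decide whether to render an image message, based on the nudity
    scores attached to it (shape assumed from the example response)."""
    nudity = message.get("nudity")
    if nudity is None:
        return True  # message was never scored; pass it through
    return nudity["safe"] >= SAFE_THRESHOLD

msg = {
    "image": "https://example.com/photo.jpg",  # hypothetical image URL
    "nudity": {"raw": 0.034964, "partial": 0.043041, "safe": 0.921994},
}
print(should_display(msg))  # -> True
```

Because the block runs in the publish path, a stricter alternative is to drop or redact unsafe messages server-side before they ever reach subscribers.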

With Sightengine, you can scan any message for unsafe images through their simple API. Beyond nudity detection, you can detect faces and people, or determine whether an image is a natural scene or an illustration: great functionality for any real-time application where users create and submit images.