Sightengine's Image Moderation API allows developers to moderate and filter user-generated photos for adult content, validation, and photo type. The Sightengine Block allows you to execute image moderation on your data streams in-motion. For example, if your app allows users to publish photos, you can analyze and filter them before they're published to end users — ideal for chat apps, user forums, social media aggregators, and more.
Some Blocks require third-party service signup
When Blocks integrate with third-party services, such as Amazon, Librato, MapBox, or RingCentral, the services often require account setup to use them. Please make sure to visit each Block's documentation page, located in the Blocks Catalog, to learn what is required.
Sightengine Function Code
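Below is a minimal sketch of what the Before Publish function can look like. It calls Sightengine's `check.json` endpoint with the `nudity` model and attaches the scores to the message. The `image_url` field name, the 0.5 block threshold, and the fail-open error handling are assumptions for illustration; replace `API_USER` and `API_SECRET` with the credentials from your Sightengine account.

```javascript
// Sketch of a Before Publish function using Sightengine's nudity model.
// Assumptions: messages carry the image in an `image_url` field, and a
// raw-nudity score above 0.5 should block publication.
function moderateImage(request) {
    const imageUrl = request.message && request.message.image_url;
    if (!imageUrl) {
        return request.ok(); // nothing to moderate, publish as-is
    }
    const xhr = require('xhr'); // PubNub Functions HTTP module
    const API_USER = 'YOUR_SIGHTENGINE_API_USER';
    const API_SECRET = 'YOUR_SIGHTENGINE_API_SECRET';
    const url = 'https://api.sightengine.com/1.0/check.json'
        + '?models=nudity'
        + '&api_user=' + API_USER
        + '&api_secret=' + API_SECRET
        + '&url=' + encodeURIComponent(imageUrl);
    return xhr.fetch(url)
        .then((response) => {
            const result = JSON.parse(response.body);
            // Attach the raw/partial/safe scores so subscribers can filter
            request.message.nudity = result.nudity;
            // Block clearly unsafe images before they reach subscribers
            if (result.nudity && result.nudity.raw > 0.5) {
                return request.abort();
            }
            return request.ok();
        })
        .catch(() => request.ok()); // fail open if Sightengine is unreachable
}
// Inside PubNub Functions, end the file with: export default moderateImage;
```

Messages without an `image_url` pass through untouched, so the function can safely run on a channel that also carries plain text.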
Setting up the Function
Create a new function. Go to your PubNub Dashboard, create a new module, and then create a new Before Publish function. Set the function to trigger on a specific set of channels (such as chat.*) or on all channels using the wildcard *.
Copy the function code from above into the editor. Replace the placeholder API credentials with the ones from your Sightengine account.
Click Start module to start the function, and test it using the Test Payload field and Publish button on the left.
Testing the Function
Input: Publish a JSON message containing an image URL on the input channel.
Output: Messages enriched with nudity-detection data are received on the same channel.
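For example, a test payload and the resulting enriched message might look like the following (the `image_url` field name and the score values are illustrative):

```json
{ "image_url": "https://example.com/photo.jpg" }
```

```json
{
  "image_url": "https://example.com/photo.jpg",
  "nudity": { "raw": 0.01, "partial": 0.02, "safe": 0.97 }
}
```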