Page 28 - MSDN Magazine, May 2017
Moderator will create a review and populate the ReviewId field with an identifier. The image will then end up in the Review UI for your review team (see Figure 9).
The review tool and its use would benefit from a bit of explanation. The tool is designed for handling large volumes of images. A reviewer looks at all the pictures on a screen, tags the ones that don’t pass muster and then moves to the next screen. The tool gives the reviewer a few seconds to go back in case of a mistake. After those few seconds, Content Moderator saves the images with the final tags and calls the callback function we specified again, this time with the final judgment. We can now take appropriate action based on our business requirements, either taking down the content or publishing it. The second callback will look like what’s shown in Figure 10.
The CallBackType is now Review instead of Job and you can see the added ReviewerResultTags, while ContentId and ReviewId match the results from the first callback.
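A minimal sketch of how a bot back end might branch on the two callback types. The handler name, the action strings and the assumed shape of ReviewerResultTags (keys a for adult and r for racy with "True"/"False" values) are our own illustrative choices, not something the Content Moderator API prescribes:

```javascript
// Hypothetical callback handler; the function name, the action strings and
// the assumed ReviewerResultTags shape are illustrative, not part of the API.
function handleModeratorCallback(body) {
  if (body.CallBackType === 'Job') {
    // First callback: the automated workflow finished.
    return { contentId: body.ContentId, action: 'pending' };
  }
  if (body.CallBackType === 'Review') {
    // Second callback: the human review team has given its final judgment.
    var tags = body.ReviewerResultTags || {};
    var flagged = tags.a === 'True' || tags.r === 'True';
    return { contentId: body.ContentId, action: flagged ? 'takedown' : 'publish' };
  }
  return { contentId: body.ContentId, action: 'ignore' };
}
```

Correlating on ContentId in both branches is what lets the bot find the original post when either callback arrives.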
Custom Workflows
Now that we have a good understanding of the default workflow, we can start turning some knobs and dials. For Butterfly, we want to allow everything with a racy score less than 0.7, but block anything with a racy score higher than 0.9. For anything in between, we want the review team to take a second look. Therefore, in the workflow editor, we’ll create a new workflow.
You’ll see that there are lots of options in the dropdowns for Connect to. These options allow you to build advanced workflows
Figure 7 Calling the Review Job API
Figure 6 The Butterfly Bot Working with a Content Moderator Workflow
content with the low scores and block the content when it’s clearly too racy or adult. In the gray area in between, we want the content to go to the review tool for human moderators to inspect (Step 4). Our team of reviewers can then decide how to deal with the content. When they’re done reviewing, Content Moderator will call our bot’s app service back to share the result. At that point, the bot can take down the content if it has been flagged as offensive. Note the flexibility here. You can adjust the scores in your workflow and the reviewers can decide what’s appropriate for your specific app.
To get started, you’ll need to sign up for the Content Moderator review tool at bit.ly/2n8XUB6. You can sign up with your Microsoft account or create a local account. Next, the site asks you to create a review team, whose purpose is to review gray-area content. You can create multiple sub teams and create workflows that assign reviews to different sub teams. In the credentials tab of the portal’s Settings page, you can link up your Content Moderator settings with the Azure Cognitive Services resource you created previously. Just copy the Key and Resource ID from the Azure portal to the Subscription Key and Resource ID settings in the Moderator UI. When you first create your account, you get an auto-configured “default” workflow. As you can see in the Review UI, this workflow will create a human review if an image is found to be adult. Let’s start by using this workflow in the Review API’s Job operation.
To call the Review Job API, you use the code shown in Figure 7.
Note that the URL contains the team name Butterfly and the postfix jobs. In CallBackEndpoint we specify the REST endpoint that Content Moderator will call to report the review results. We also specify a unique ContentId so we can correlate the image when Content Moderator calls us back, and we send the actual image URL in ContentValue. When the call succeeds, the body of the result doesn’t contain any Content Moderator result; instead, it returns the JobId:
{"JobId":"2017035c6c5f19bfa543f09ddfca927366dfb7"}
You’ll get the result through the callback you specify in CallBackEndpoint. This result will again have the JobId, potentially a ReviewId, and a ContentId so you can cross-reference it. For the default workflow, Content Moderator will call back immediately with the result in Metadata if the image isn’t considered adult. The actual JSON will look similar to what’s shown in Figure 8.
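A minimal sketch of consuming that callback payload. The helper and the field names we pull out are our own choices, but the JSON shape matches Figure 8; note that the scores arrive as strings inside Metadata:

```javascript
// Parse the first (Job) callback body, shaped like Figure 8.
function parseJobCallback(json) {
  var body = JSON.parse(json);
  return {
    jobId: body.JobId,
    contentId: body.ContentId,  // matches the ContentId we sent in the Job call
    complete: body.Status === 'Complete',
    isRacy: body.Metadata ? body.Metadata.isracy === 'True' : false,
    racyScore: body.Metadata ? parseFloat(body.Metadata.racyscore) : null
  };
}
```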
The status for this Job is set to Complete and the CallBackType is Job. If, however, the image is considered adult material, Content
Figure 8 Default Workflow Results
var url = 'https://westus.api.cognitive.microsoft.com/contentmoderator/review/v1.0/teams/butterfly/jobs';
var req = unirest.post(url)
  .type("application/json")
  .query({
    ContentType: 'Image',
    ContentId: '67c21785-fb0a-4676-acf6-ccba82776f9a',
    WorkflowName: 'default',
    CallBackEndpoint: 'http://butterfly.azure.com/review'
  })
  .headers({
    "Ocp-Apim-Subscription-Key": <ocp_key>
  })
  .send({
    "ContentValue": pictureUrl
  })
  .end(function (res) {
    return callback(res.error, res.body);
  });
{
  "JobId": "2017035c6c5f19bfa543f09ddfca927366dfb7",
  "ReviewId": "",
  "WorkFlowId": "default",
  "Status": "Complete",
  "ContentType": "Image",
  "CallBackType": "Job",
  "ContentId": "67c21785-fb0a-4676-acf6-ccba82776f9a",
  "Metadata": {
    "adultscore": "0.465",
    "isadult": "False",
    "racyscore": "0.854",
    "isracy": "True"
  }
}