Page 24 - MSDN Magazine, May 2017
COGNITIVE SERVICES
Protect Web Apps Using Microsoft Content Moderator
Maarten van de Bospoort and Sanjeev Jagtap
Every day billions of users take pictures and videos to share on social media. As anyone who’s dealt with user-generated content on the Internet knows, the net’s anonymity doesn’t necessarily surface the prettiest human behavior.
Another important recent trend is the proliferation of chat bots. Not a day goes by without the release of a new set of bots that can do everything from booking travel to coordinating your meetings to online banking. While those bots are useful without a doubt, the killer chatbot is still elusive: the one bot that all the messaging platforms want in order to crack the goal of 1 billion daily active users.
Now let’s imagine you created just that: Butterfly, the bot that everyone feels compelled to engage with. Users share media with your bot and, through your secret machine learning algorithm, the bot predicts their future for the next 24 hours. After a year of hard work, you release your bot. Overnight, Butterfly goes viral. Unfortunately, your startup dream quickly turns into a public relations nightmare. Users are submitting adult and racy content, which is then shared and publicly available to other bot users. And some of the content is bad. Really bad. Users are suing you; the phone is ringing off the hook; and you receive threats that your Web service will be shut down. You need a powerful solution to help detect and prevent the bad content from being visible to other users. And you need it quickly.
That’s where Microsoft Content Moderator comes to the rescue!
In this article, we’ll show you how Content Moderator can help. We’ll start by creating a chatbot using the Microsoft Bot Framework, but keep in mind that the information applies equally to any Web or client application. Butterfly will enable end users to share text, images and videos, and will use Content Moderator to filter out the inappropriate material before it gets published. Along the way, you’ll learn how to configure custom Content Moderator workflows and to adjust the thresholds for the content classifiers. We’ll also discuss the different connectors that can be used in the workflow, such as Text and Child Exploitation. Let’s start with an overview of content moderation.
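To make the moderation step concrete before we build the bot, here's a minimal sketch of calling the Content Moderator image Evaluate operation over its REST interface and gating publication on the returned classifier scores. The `westus` region, the placeholder subscription key, and the 0.5 thresholds are illustrative assumptions; use your own Azure region, key, and tuned thresholds.

```python
import json
import urllib.request

# Assumptions for illustration: your region and key will differ.
SUBSCRIPTION_KEY = "<your-content-moderator-key>"
ENDPOINT = ("https://westus.api.cognitive.microsoft.com/"
            "contentmoderator/moderate/v1.0/ProcessImage/Evaluate")

def evaluate_image(image_url):
    """POST an image URL to the Evaluate operation and return the
    parsed JSON result (adult/racy scores and boolean classifications)."""
    body = json.dumps({"DataRepresentation": "URL",
                       "Value": image_url}).encode("utf-8")
    req = urllib.request.Request(
        ENDPOINT, data=body, method="POST",
        headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
                 "Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

def is_allowed(result, adult_threshold=0.5, racy_threshold=0.5):
    """Decide whether to publish, given the classifier scores.
    Missing scores are treated as blocked; thresholds are illustrative."""
    return (result.get("AdultClassificationScore", 1.0) < adult_threshold
            and result.get("RacyClassificationScore", 1.0) < racy_threshold)
```

In the bot, you'd call `evaluate_image` on each incoming image and only forward it to other users when `is_allowed` returns true; the threshold tuning and human-review workflows discussed later refine this basic gate.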
Content Moderation
Microsoft has a long and proven track record combatting digital crime. The Microsoft Digital Crimes Unit works hard to take down botnets, limit tech support fraud, thwart phishing schemes and more. One less-visible active area is how the unit assists law
This article discusses:
• How content moderation works
• Creating a bot to let users share images
• Using Content Moderator to evaluate image content
• Using the Moderator Review API to enable human review of content
• Creating custom workflows
Technologies discussed:
Microsoft Content Moderator, Microsoft Bot Framework, Moderator Review API
Code download available at:
msdn.com/magazine/0517magcode