AI Case Study
Spirit AI aims to make gaming communities more inclusive by using natural language understanding and machine learning to monitor conversations and detect cyberbullying
Spirit AI has launched Ally, a natural-language-understanding and machine-learning software product that monitors online conversations in gaming communities and chat rooms to detect abusive behaviour and predict which problems are likely to escalate.
Consumer Goods And Services
Entertainment And Sports
"Ally considers context, nuance and the relationships between users, rather than simply seeking and blocking keywords. The software uses natural language understanding and AI to identify the intent of a message, and then analyses the recipients' behaviour and reactions to determine its impact. It can, for example, identify cyberbullying.
The system takes a player-centric approach to understand nuances in interactions. Friends can use playful taunts and language, but when it’s a stranger, the context is totally changed. With this combination of both player and community preferences, Ally can take quick action when needed.
Ally investigates abuse incidents further through a multi-stage triage process, examining the deeper context of the whole conversation and the history of interactions between the players. Ally highlights repeat perpetrators who display similar abusive behaviours on multiple occasions, and automatically compiles a dossier of evidence for your safeguarding team to act upon."
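Spirit AI has not published Ally's internals, so the following is only an illustrative sketch of the multi-stage triage described above: a message is first flagged, then checked against conversational context (playful taunts between friends are treated as benign), and confirmed incidents are accumulated into a per-player dossier so repeat perpetrators surface automatically. The class names, the trigger phrases and the threshold are all invented for the example.

```python
# Hypothetical triage pipeline inspired by the description of Ally.
# Stage 1 flags a message, stage 2 applies relationship context,
# and confirmed incidents build a dossier per offender.
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class Incident:
    offender: str
    target: str
    message: str


class TriagePipeline:
    def __init__(self, repeat_threshold: int = 2):
        self.repeat_threshold = repeat_threshold
        self.dossiers = defaultdict(list)  # offender -> list of incidents

    def stage1_flag(self, message: str) -> bool:
        # Placeholder for the NLU intent classifier: here, a crude
        # phrase check stands in for a learned model.
        return any(p in message.lower() for p in ("hate you", "quit the game"))

    def stage2_context(self, are_friends: bool) -> bool:
        # Deeper context: the same words between friends are not escalated.
        return not are_friends

    def triage(self, offender, target, message, are_friends=False):
        if self.stage1_flag(message) and self.stage2_context(are_friends):
            self.dossiers[offender].append(Incident(offender, target, message))

    def repeat_offenders(self):
        # Offenders with enough confirmed incidents for a dossier review.
        return [p for p, inc in self.dossiers.items()
                if len(inc) >= self.repeat_threshold]


pipeline = TriagePipeline()
pipeline.triage("troll7", "newbie", "I hate you, quit the game")
pipeline.triage("troll7", "newbie", "just quit the game already")
pipeline.triage("pal", "buddy", "I hate you, quit the game", are_friends=True)
print(pipeline.repeat_offenders())  # ['troll7'] — the friends' taunt is ignored
```

The point of the two-stage structure is the one the excerpt makes: flagging alone over-blocks friendly banter, so context and history decide what actually reaches the safeguarding team.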
POC; Results not yet available
"Language and behavioral analysis"
"Interaction patterns analysed by machine learning, combined with natural-language classifiers, rather than relying on a list of individual keywords"
According to the BBC, the anti-bullying charity Ditch the Label conducted a survey among gamers: 47% said they had been threatened in an online game, and 74% said they would like the issue to be taken more seriously.
Online conversations, Chatrooms, Social media