AI Case Study
Facebook shuts down its virtual assistant chatbot experiment as it proves unable to cope with the difficulty of the tasks it is asked to perform
Facebook was testing its virtual assistant Messenger chatbot, M, available to 10,000 people in the San Francisco area. The bot was designed to perform tasks such as making restaurant reservations, changing flights or waiting on hold with customer service. However, because users could ask the chatbot any sort of question and use it to perform any kind of task, M proved incapable of progressively automating that work. It has been estimated that M never surpassed 30 percent automation, and it was therefore shut down.
Internet Services Consumer
"Despite the hype, M, which lived in Facebook Messenger, was presented as an experiment. The free service was only offered to 10,000 people in the San Francisco area, who used it to do things like book restaurant reservations, change flights, send gifts, and wait on hold with customer service. For those that had access, M was a fantastic perk. But for Facebook, it was a cost center.
That’s because most of the tasks fulfilled by M required people. Facebook’s goal with M was to develop artificial-intelligence technology that could automate almost all of M’s tasks. But despite Facebook’s vast engineering resources, M fell short: One source familiar with the program estimates M never surpassed 30 percent automation. Last spring, M’s leaders admitted the problems they were trying to solve were more difficult than they’d initially realized.
It was easy for M’s leaders to win internal support and resources for the project in 2015, when chatbots felt novel and full of possibility. But as it became clear that M would always require a sizable workforce of expensive humans, the idea of expanding the service to a broader audience became less viable.
M's core problem: Facebook put no bounds on what M could be asked to do. Alexa, by contrast, has proven adept at handling a narrower range of questions, many tied to facts or to Amazon's core strength in shopping.
Another challenge: When M could complete tasks, users asked for progressively harder tasks. A fully automated M would have to do things far beyond the capabilities of existing machine learning technology. Today's best algorithms are a long way from being able to really understand all the nuances of natural language.
Facebook did succeed in automating some of the work its army of contractors used to perform in the guise of M. If you ask the bot to get flowers delivered, it can automatically get suggestions from online florists, only asking a human to choose which quotes to present to the user.
Facebook is not left entirely empty-handed. The people who used the service and role-played as the omniscient assistant have generated valuable data that can be used by the company's AI researchers. Using machine learning to make software better at understanding natural language and conversation is one of the group's primary interests."
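The partial-automation pattern described in the quote, where the bot gathers florist quotes automatically and a human contractor only chooses which to surface, can be sketched roughly as below. This is a hypothetical illustration of the human-in-the-loop design, not Facebook's actual system; all names (`get_quotes`, `human_select`, `FLORISTS`) are invented for the example.

```python
# Hypothetical sketch of M's human-in-the-loop workflow: the automated step
# collects candidate quotes, and a human agent only picks which ones to
# present to the user. None of these names reflect Facebook's real API.

FLORISTS = {
    "Bloom&Co": 45.00,
    "PetalWorks": 39.50,
    "RoseDirect": 52.25,
}

def get_quotes(request):
    """Automated step: collect candidate quotes from online florists,
    cheapest first."""
    return sorted(FLORISTS.items(), key=lambda kv: kv[1])

def human_select(quotes, max_options=2):
    """Manual step: a contractor decides which quotes reach the user.
    Simulated here by simply keeping the cheapest options."""
    return quotes[:max_options]

def handle_request(request):
    quotes = get_quotes(request)   # automated
    chosen = human_select(quotes)  # human-in-the-loop
    return [f"{name}: ${price:.2f}" for name, price in chosen]

print(handle_request("deliver flowers"))
```

The split between `get_quotes` and `human_select` mirrors why M stayed costly: every request still passed through the manual step, so headcount scaled with usage even as parts of the pipeline were automated.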
M could be asked to perform an unbounded range of tasks, which proved to be infeasible.
"Few expected that voice assistants like Amazon's Alexa and Google Assistant would thrive and text-based chatbots would become a punchline. Betaworks’ accelerator, which the company says was designed as a one-off, has moved on to other themes. Kik pivoted to blockchain technology."
Messages in Messenger, containing users' questions and requests for tasks