It is difficult to remember now, but there was a moment in early 2016 when many in the tech industry believed chatbots, automated text-based virtual assistants, would be the next big platform. Messaging app Kik staked its company's future on bots and "chatvertising." Startup studio Betaworks launched an accelerator program called Botcamp. And at its 2016 F8 conference, Facebook pitched bots to developers as the best way to connect with 900 million Messenger users.
Few expected that voice assistants like Amazon's Alexa and Google Assistant would thrive and text-based chatbots would become a punchline. Betaworks' accelerator lasted one class. Kik pivoted to blockchain technology. And now Facebook says it will shutter M, its buzzy full-service virtual assistant, on Jan. 19.
In some respects it's impressive that Facebook kept M running as long as it did. Despite the hype, M, which lived in Facebook Messenger, was presented as an experiment. The free service was offered only to 10,000 people in the San Francisco area, who used it to do things like book restaurant reservations, change flights, send gifts, and wait on hold with customer service. For those who had access, M was a fantastic perk. But for Facebook, it was a cost center.
That's because most of the tasks fulfilled by M required people. Facebook's goal with M was to develop artificial-intelligence technology that could automate almost all of M's tasks. But despite Facebook's vast engineering resources, M fell short: One source familiar with the program estimates M never surpassed 30 percent automation. Last spring, M's leaders admitted the problems they were trying to solve were more difficult than they'd initially realized.
It was easy for M's leaders to win internal support and resources for the project in 2015, when chatbots felt novel and full of possibility. But as it became clear that M would always require a sizable workforce of expensive humans, the idea of expanding the service to a broader audience became less viable.
M's core problem: Facebook put no bounds on what M could be asked to do. Alexa, by contrast, has proven adept at handling a narrower range of requests, many tied to factual questions or to Amazon's core strength, shopping.
Another challenge: Whenever M completed a task, users responded by asking for progressively harder ones. A fully automated M would have to do things far beyond the capabilities of existing machine learning technology. Today's best algorithms are a long way from being able to truly understand all the nuances of natural language.
Facebook did succeed in automating some of the work its army of contractors used to perform in the guise of M. If you ask the bot to get flowers delivered, it can automatically get suggestions from online florists, only asking a human to choose which quotes to present to the user.
Facebook is not left entirely empty-handed. The people who used the service, and the contractors who role-played as the omniscient assistant, generated valuable data that can be used by the company's AI researchers. Using machine learning to make software better at understanding natural language and conversation is one of the group's primary interests.
"We launched this project to learn what people needed and expected of an assistant, and we learned a lot," Facebook said in a statement. "We're taking these useful insights to power other AI projects at Facebook. We continue to be very pleased with the performance of M suggestions in Messenger, powered by our learnings from this experiment."