Amazon's head of A.I. for its AWS cloud computing outfit, Matt Wood, sits down for a talk about how the company is popularizing machine learning and related tasks, and where the technology is headed in coming years.
Amazon (AMZN) today kicked off its user conference for its AWS cloud computing service, "AWS Summit," in New York, and I stopped by the Javits convention center, where it was being held, to meet with one of the keynote speakers, Matt Wood, who runs the artificial intelligence operations for AWS.
Wood started early with AWS, joining in 2010 as employee number three in its European offices. The business, started by Amazon in 2007, today accounts for billions in revenue and is one of the prime drivers of the company's stock valuation.
Wood had a bunch of things to say about new products, such as "Macie," which can scan the collections of files you keep on AWS and let you know if any sensitive data, such as customer Social Security numbers or credit card numbers, has been exposed to the outside world. It can also automatically yank that data without human intervention. The point is that, as with all machine learning, Macie figures out on its own where the risks of data exposure lie, without having to be fed an elaborate set of security rules.
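To make the contrast concrete, here is a minimal sketch of what the rule-based approach Macie avoids looks like. The patterns and function names below are hypothetical illustrations, not Macie's actual method; the whole point of the product, per Wood, is that its machine learning models replace fixed rule lists like this one.

```python
import re

# Illustrative patterns only: Macie itself uses machine learning rather
# than a hand-maintained rule list; these regexes are a hypothetical
# simplification for comparison.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}

def scan_for_sensitive_data(text):
    """Return the kinds of sensitive data found in a blob of text."""
    return [kind for kind, pat in PATTERNS.items() if pat.search(text)]

print(scan_for_sensitive_data("Customer SSN: 123-45-6789"))  # ['ssn']
```

A rule set like this must be extended by hand for every new data type; a learned model, by contrast, can generalize to exposure risks nobody wrote a rule for.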
One of the things most interesting to me is that I've not heard as much about A.I. in relation to AWS. It seems like Alphabet's (GOOGL) Google unit, and IBM (IBM), and Microsoft (MSFT) and Facebook (FB) and Baidu (BIDU) - just about every other tech giant gets more press about A.I. and machine learning advances.
Why, I asked, does Amazon have such low visibility on A.I.?
Wood, who speaks with a crisp British accent, is unfazed by the lower profile, telling me, "We tend not to thump our chests about it."
Companies have different ways of looking at things, and some are "product-focused," as he describes Google and others, without mentioning them by name. "We are much more interested in enabling others; we are unusually customer-focused," he says.
Despite the lack of publicity, "inside Amazon we've been doing machine learning for over twenty years," he notes. And anyway, "We have more machine learning running on the platform than anywhere else," he claims, meaning AWS is doing more A.I. than Google or any other facility in the world. Wood can rattle off the names of customers using machine learning: consumer health software provider ZocDoc, which has developed a machine vision application with it; InstaCart, the grocery delivery service, which uses it to know where produce is at various local stores; Pinterest, the photo-sharing service, which uses it to classify images; and StitchFix, the fashion startup that Wood says uses it "to predict the next fashion trend."
Wood got a medical degree and might have stayed working in hospitals but for the fact that it dawned on him "I could, at best, perhaps help tens of thousands of people," whereas with software technologies he saw a potential to have a greater impact on the world.
Along the way, he also got a PhD in machine learning, and he worked "at the tail end" of the Human Genome Project, he says, in research that included understanding protein folding, a task for which machine learning can help a lot. He seems especially proud of the fact that there is "tons of genomics today running on AWS, a lot of analysis happens there."
The Amazon mission with respect to A.I. is the same as the overall AWS mission, he says, namely, "to put tech in the hands of as many developers as possible."
"We're taking the exact same approach with machine learning as we took with Web services."
AWS offers three "tiers" of machine learning, ranging from the very sophisticated "deep learning" facilities, which are "very technical" and meant for scientists; to tools that let "customers build their own data models"; to a set of application programming interfaces (APIs) that are "just a black box so the dev doesn't have to know anything about speech recognition," or that let developers add "chat bots" to their applications, he says.
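A rough sketch of what that black-box tier looks like from the developer's side follows. The class and method names here are assumptions for illustration, not the actual AWS API; the point is only that the caller sees a single function and never touches the underlying model.

```python
# Hypothetical sketch of the "black box" API tier: the developer calls
# one method and gets a transcript back. Names are invented for this
# example and are not the real AWS interface.

class SpeechClient:
    """Stand-in for a managed speech-recognition service."""

    def transcribe(self, audio_bytes):
        # A real service would run a trained speech model server-side;
        # this stub returns a canned result to show the calling shape.
        return {"transcript": "hello world", "confidence": 0.98}

client = SpeechClient()
result = client.transcribe(b"...raw audio...")
print(result["transcript"])  # hello world
```

Everything about acoustic modeling, vocabularies, and training is hidden behind that one call, which is exactly the trade-off Wood describes: less control, far less required expertise.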
"I think about Web sites five to ten years ago," he muses, "it was quite a task to build a Web site, but today, anyone can build one. And so people have built these entirely new categories of applications, like Airbnb."
Likewise, "Today, machine learning is very technical," he says, but over time, and with Amazon's help, it is going to be simpler and simpler to apply machine learning to any number of different applications, "and to do it with high accuracy."
Wood noted another important development, the shift from just the "training" phase of A.I., where a computer deduces patterns, to the "inference" stage, where it responds to user requests based on what it's learned.
In particular, he notes the increase in "inference at the edge," meaning, in client computing devices and other things that are not inside the data center. One example is the "Echo" smart home device, which uses a "simplified voice model" to hear when you say the "wake word," its name, "Alexa," from across the room. "That's a form of inference there on the device; we don't do anything with the audio until we detect the wake word, and then handover is given to the cloud," he observes.
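The edge-to-cloud handoff Wood describes can be sketched in a few lines. This is a hedged toy model: the string match below stands in for the simplified on-device voice model, and the callback stands in for the cloud service; none of these names come from Amazon's actual implementation.

```python
# Toy model of wake-word inference at the edge: nothing is sent to the
# cloud until a local check detects the wake word. The substring match
# is a stand-in for the real on-device acoustic model.

WAKE_WORD = "alexa"

def on_device_listener(audio_frames, send_to_cloud):
    """Forward frames to the cloud only after the wake word fires."""
    awake = False
    for frame in audio_frames:
        if not awake:
            # Local-only inference; pre-wake audio never leaves the device.
            awake = WAKE_WORD in frame.lower()
        else:
            send_to_cloud(frame)

sent = []
on_device_listener(
    ["background chatter", "Alexa", "what's the weather?"],
    sent.append,
)
print(sent)  # only post-wake audio: ["what's the weather?"]
```

The design point is privacy and efficiency in one: the cheap local model runs constantly, and the expensive cloud model only sees audio after the handoff.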
Another "edge" example is Amazon's "Prime Air," the drone-based delivery service for Amazon customers. It's an example of autonomous vehicles using the principles of machine learning while untethered from the cloud.
I wrapped up the conversation asking Wood what he thinks of machines making machines, meaning, machine learning being able to design new algorithms for machine learning, a kind of self-reflexive moment in A.I. "Absolutely," says Wood. "It's already happening. There are customers on AWS who are training bots to make algorithms." One example is something called Bandits, where machines face off against one another, with one machine trying to deduce learning models while the other is trying to trick it with falsehoods.
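The learner-versus-deceiver dynamic Wood describes can be illustrated loosely. This is not the actual "Bandits" setup, just a minimal sketch of the idea: one side tries to learn a quantity from data while an adversary injects falsehoods, and a learner that anticipates the adversary fares better.

```python
import random

# Loose illustration of a learner-vs-adversary face-off. All names and
# numbers are invented for this sketch.

random.seed(0)

true_value = 10.0
honest = [true_value + random.gauss(0, 0.5) for _ in range(50)]

# The adversary slips in wildly wrong samples to mislead the learner.
poisoned = honest + [1000.0] * 5

def naive_learner(samples):
    # The mean is easily dragged off course by a few extreme lies.
    return sum(samples) / len(samples)

def robust_learner(samples):
    # The median shrugs off a small fraction of corrupted samples.
    s = sorted(samples)
    return s[len(s) // 2]

print(round(naive_learner(poisoned), 1))   # pulled far from 10 by the lies
print(round(robust_learner(poisoned), 1))  # still close to 10
```

Playing the two sides against each other like this, at scale, is one way machine learning systems end up producing better learning procedures than either side started with.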
Amazon shares today are up $14.14, or 1.5%, at $982.13, and the stock is up 31% this year.