I write columns on news related to bots, especially in the categories of artificial intelligence, bot startups and bot funding. I am also interested in recent developments in the fields of data science, machine learning and natural language processing.
In a sign that deep learning neural networks are going mainstream, computing and software giants IBM Corp. and SAP SE today announced separate initiatives aimed at making it easier for large enterprises to use deep learning in their operations.
Deep learning neural networks attempt to emulate how the brain learns, allowing computers to learn partly on their own rather than being explicitly programmed. The approach is responsible for big advances in recent years in speech and image recognition, making possible everything from smart digital assistants such as Amazon.com Inc.'s Alexa to self-driving cars.
But to date, most deep learning devotees have been large internet companies such as Google Inc., Facebook Inc. and Microsoft Corp. Few other large enterprises have yet used the technology extensively in their core applications.
"Deep learning has become interesting to the enterprise," Sumit Gupta, vice president of high-performance computing and machine learning at IBM, said in an interview. "All the interest in artificial intelligence has been driven by deep learning."
The opportunity is immense. According to a 2016 report from market intelligence firm Tractica LLC, annual deep learning software revenue worldwide will jump from $109 million in 2015 to $10.4 billion in 2024, representing cumulative spending in the enterprise market of $40.6 billion through 2024.
IBM and SAP are far from alone in leveraging deep learning. Google, Facebook, Microsoft, Salesforce.com Inc. and others have been using the neural networks for years to buff up their own services and enable new ones. But now, enterprises are looking to get the benefits in their own applications, and that's what IBM and SAP, as well as cloud providers such as Google and Microsoft, are looking to provide.
Today, IBM announced that it's combining two sets of software tools already aimed at making it easier for enterprises to apply AI technologies to their operations. IBM's Data Science Experience, a collaborative workspace where data scientists manage their data and trained machine learning models, can now be integrated with the company's PowerAI, a deep learning platform that bundles open-source frameworks such as TensorFlow, Caffe and Torch for image and speech recognition, natural language processing and similar tasks that require learning over time.
The idea is to make it much easier for data scientists to use deep learning on a variety of tasks. For example, IBM said, banks could make better predictions on whether clients might default on loans or detect credit-card fraud. Manufacturers could train models on historical machine data to identify potential failures before they happen. Inventory management could be optimized based not only on point-of-sale data but also on weather and other outside data. "The data scientist doesn't have to deal with provisioning or managing the server," Gupta said.
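As a toy illustration of the kind of task described above, here is a minimal sketch of training a classifier to flag suspicious transactions. The data and model are invented for illustration, and the example uses plain NumPy rather than IBM's actual tooling; in practice a data scientist would reach for one of the frameworks PowerAI bundles, such as TensorFlow or Torch.

```python
import numpy as np

# Toy, synthetic "transaction" data: two features (say, amount and
# time-of-day), with label 1 meaning fraudulent. Entirely invented.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)  # a simple separable rule

# A single sigmoid neuron trained by gradient descent -- the smallest
# possible "neural network", standing in for the deep models the
# article describes.
w = np.zeros(2)
b = 0.0
lr = 0.5
for _ in range(500):
    z = X @ w + b
    p = 1.0 / (1.0 + np.exp(-z))   # sigmoid activation
    grad_z = (p - y) / len(y)      # gradient of the log-loss
    w -= lr * (X.T @ grad_z)
    b -= lr * grad_z.sum()

preds = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(float)
accuracy = (preds == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

A production fraud model would be far deeper and trained on millions of real transactions, but the loop is structurally the same: compute predictions, measure the error, and nudge the weights.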
Although the Data Science Experience is available both in the cloud and in on-premises data centers, PowerAI is only on-premises. So the integrated software is available for now only in on-premises data centers, with no specific plans yet for a cloud version.
For its part, SAP today announced that it will be the first company to provide an enterprise machine learning portfolio of technologies on computer systems that use Nvidia Corp.'s latest-generation Volta graphics processing units. GPU chips, traditionally used for gaming and other graphics-intensive uses, have become a standard for deep learning because they can process the massive amounts of data needed in parallel, making them much faster than conventional processors.
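The workload GPUs accelerate is easy to see in miniature: most of the arithmetic in a neural network is large matrix multiplication, where every output element is an independent dot product and can be computed in parallel. A rough sketch of that core operation (in NumPy; on a GPU, a framework would spread the same math across thousands of cores — the numbers here are arbitrary):

```python
import numpy as np

# One neural-network layer is essentially one matrix multiply: each of
# the 512 output neurons is an independent dot product over 1,024 inputs,
# which is why GPUs, with thousands of parallel cores, outrun CPUs here.
batch = np.ones((64, 1024))           # 64 inputs, 1,024 features each
weights = np.full((1024, 512), 0.01)  # layer weights

activations = batch @ weights  # 64 x 512 independent dot products
print(activations.shape)       # prints (64, 512)
```

Training repeats this operation billions of times over large batches of data, so hardware that parallelizes it well, such as Nvidia's Volta GPUs, directly shortens training time.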
Nvidia's GPUs were first installed in Nvidia DGX-1 systems at SAP sites in Israel and Potsdam, Germany, in 2016, and later in SAP's production data center in St. Leon-Rot, Germany, and its innovation labs in Palo Alto, California, and Singapore. Already, they've been powering model training for SAP's Leonardo Machine Learning portfolio of software.
Now, SAP is upgrading its systems in St. Leon-Rot with Nvidia's Volta hardware, which, Jim McHugh, vice president and general manager of Nvidia's DGX computing business, told SiliconANGLE, is the first production Volta solution for the enterprise. "We've been making the benefits of deep learning and human-level intelligence available to enterprise applications users from the get-go," Markus Noga, SAP's vice president of machine learning, said in an interview.
For instance, SAP's Brand Impact, a cloud-based machine learning service that analyzes videos to detect brand logos and measure their exposure in film and television to gauge ad effectiveness, can now deliver analyses in 24 hours, versus the six weeks the work used to take manually.
SAP also has Leonardo Machine Learning services for, among other things, customer service tickets, matching of customer invoices to payments and, in prototype form, an automated accounts payable system. Noga said SAP plans to extend the services starting next year to mainstream enterprise resource planning applications such as core financials, procurement and production.
For now, the Leonardo Machine Learning portfolio on the Volta systems is being offered only in the cloud, though Noga said the company is in the process of defining an on-premises offering.