Role of AI in Real-Time Analytics

By Jyoti Nigania | Aug 24, 2018

Nearly every industry is finding a way to benefit from AI-driven analytics. While great strides have been made in the adoption of real-time analytics in the marketplace, artificial intelligence could ramp this up further. Analytics has advanced considerably in recent years, with data applied against algorithms or analytics engines to determine what it may mean for the business. Lately, there has been significant progress in real-time analytics, especially when applied to streaming data from systems and devices.

That's the word from a group of McKinsey Global Institute analysts, led by Michael Chui, who connected the dots between AI and hundreds of use cases across 20 industries in a recent study. Notably, they observe, the greatest value from AI, as indicated by more than two-thirds (69 percent) of the projects studied, lies in improving the performance of existing analytics efforts. For purposes of clarity, the analysts define AI as deep learning techniques using artificial neural networks.

It's significant that every industry is finding a way to benefit from AI-driven analytics, because the potential case studies vary considerably. A manufacturer may be concerned with syncing its production-floor machines with its supply chain, a retailer may want to know which customers are using which channels, and a healthcare establishment may be concentrating on better ways to track patients' vital signs remotely. When cognitive computing technologies such as AI are applied to enhance real-time analytics, innovation explodes. Chui and his McKinsey team describe the following key applications arising from the intersection of AI and analytics:

  • Predictive analysis: AI is being trained to notice and identify an extensive range of anomalies. Deep learning's capacity to analyze very large amounts of high-dimensional data can take existing preventive maintenance systems to the next level, according to Chui and his co-researchers. By layering in additional data, such as audio and image data from relatively cheap sensors like microphones and cameras, neural networks can enhance and possibly replace more traditional methods. AI's ability to predict failures and allow planned interventions can reduce downtime and operating costs while improving production yield.
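To make the idea concrete, here is a minimal sketch of flagging anomalous sensor readings for planned intervention. A production system would use a trained neural network over high-dimensional data; a simple rolling z-score stands in for the model here, and all names and thresholds are illustrative.

```python
# Minimal sketch: flag sensor readings that deviate strongly from
# recent history, as a stand-in for a learned anomaly detector.
from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=20, threshold=3.0):
    """Return indices of readings that look anomalous vs. the recent window."""
    history = deque(maxlen=window)
    flagged = []
    for i, value in enumerate(readings):
        if len(history) >= window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                flagged.append(i)  # candidate for a planned maintenance check
        history.append(value)
    return flagged

# Example: a steady vibration signal with one spike at index 40
signal = [1.0, 1.1, 0.9, 1.0] * 10 + [9.0] + [1.0] * 5
```

In practice the flagged indices would feed a maintenance queue rather than a return value, but the shape of the pipeline is the same: stream in readings, score each against a model of normal behavior, and escalate outliers.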

  • AI-driven logistics optimization: According to the McKinsey team, AI can reduce costs through real-time forecasts and behavioral coaching. Applying AI techniques such as continuous estimation to logistics can add substantial value across sectors. AI can enhance the routing of delivery traffic, thereby improving fuel efficiency and reducing delivery times. One European trucking company, for example, has reduced fuel costs by 15 percent by using sensors that monitor both vehicle performance and driver behavior; drivers receive real-time coaching, including when to speed up or slow down, optimizing fuel consumption and reducing maintenance costs.
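The real-time coaching loop described above can be sketched as a simple rule over telemetry samples. The speed band, RPM limit, and messages below are invented for illustration; a fleet system would learn these from its own data rather than hard-coding them.

```python
# Illustrative sketch of real-time driver coaching from telemetry.
# Bands and messages are assumptions, not from any real fleet API.
def coach(speed_kmh, engine_rpm, optimal_band=(70, 90), rpm_limit=2500):
    """Return a coaching hint for one telemetry sample."""
    if engine_rpm > rpm_limit:
        return "shift up: high RPM wastes fuel and wears the engine"
    if speed_kmh < optimal_band[0]:
        return "speed up toward the fuel-efficient band"
    if speed_kmh > optimal_band[1]:
        return "slow down toward the fuel-efficient band"
    return "hold steady"
```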

  • Assist in customer service management: Improved speech recognition in call center management and call routing, as a result of applying AI techniques, allows a more seamless experience for customers and more efficient processing, Chui and his co-authors state. For example, deep learning analysis of audio allows systems to assess a customer's emotional tone; if a customer is responding badly to the automated system, the call can be rerouted automatically to human operators and managers.
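The rerouting decision itself is simple once a model has scored the audio. The sketch below assumes a sentiment score in [-1, 1] produced upstream by a deep learning model; the function name, threshold, and routing labels are hypothetical, not part of any real call-center API.

```python
# Hedged sketch of sentiment-based call routing. The sentiment
# score is assumed to come from a deep-learning audio model.
def route_call(call_id, sentiment_score, threshold=-0.5):
    """Escalate calls with strongly negative sentiment to a human."""
    if sentiment_score < threshold:
        return {"call": call_id, "route": "human_operator"}
    return {"call": call_id, "route": "automated_system"}
```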
As AI takes real-time analytics to a whole new level, it brings new types of requirements as well. For starters, Chui and his colleagues point out, the data requirements for deep learning are substantially greater than for other analytics, in terms of both volume and variety. Deep learning is often built upon thousands of data records before models become relatively good at classification tasks and, in some cases, millions before they perform at the level of humans. By one estimate, a supervised deep learning algorithm will generally achieve acceptable performance with around 5,000 labeled examples per category, and will match or exceed human-level performance when trained with a data set containing at least 10 million labeled examples. With the ever-expanding Internet of Things, and the terabytes and petabytes of data streaming in from various devices, systems, and applications, this threshold may well be achievable.
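The quoted rule of thumb can be written out directly. The function below simply encodes the two thresholds from the estimate above (roughly 5,000 labeled examples per category for acceptable performance, about 10 million overall to approach human-level performance); it is a back-of-the-envelope check, not a substitute for empirical evaluation.

```python
# Encodes the labeled-data rule of thumb quoted in the text.
def dataset_readiness(examples_per_category, total_examples):
    """Rough heuristic for whether a labeled set supports deep learning."""
    if total_examples >= 10_000_000:
        return "may match or exceed human-level performance"
    if examples_per_category >= 5_000:
        return "generally acceptable performance"
    return "likely insufficient labeled data"
```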

Source: HOB