The Ultimate Survival Guide on AI and Machine Learning in the Fourth Industrial Revolution, Part II

Part 2: The promise of industrial AI and machine learning.

In this five-part series on industrial AI, we will arm you with the knowledge you need to understand what AI, machine learning, data science and edge computing are, and how they are already impacting industry. See Part 1, where we covered factors contributing to digital disruption in industry.

We are inundated with messages about how AI will change the future of industry as we know it. But what exactly is AI, and how does it actually help industry? Here it is in a nutshell: AI takes the growing amount of data being generated and turns it into useful insights that drive better business decisions.

The amount of data produced in our connected world is growing at a rapid pace. By 2020, a city of 1 million people will generate 200 million gigabytes of data every single day. The volume of all data stored globally is predicted to rise from 1.4 zettabytes to 6.2 zettabytes over the next two years. As more and more industrial companies become digitized, the ability to learn from and act upon available data will become table stakes in order to survive. Businesses will need to leverage emerging technologies like machine learning, data science, and edge computing to make sense of this information quickly and accurately.

Machine Learning 101

AI and machine learning are often used interchangeably. This is misleading because machine learning is an important subset of AI. Today’s advancements in AI are largely being driven by the use of machine learning to process massive amounts of data.

AI vs. Machine Learning

  • AI is a broader term used to refer to the ability of machines to perform tasks that typically call for human cognitive functions. The English Oxford Living Dictionary defines AI as “the theory and development of computer systems able to perform tasks normally requiring human intelligence, such as visual perception, speech recognition, decision-making and translation between languages.”
  • Machine learning is the use of statistical computing to understand tendencies, patterns, characteristics, attributes and structure in data, to inform decisions with new observations. It is a computer science field that gives systems the ability to learn from data to improve the performance of specific tasks, without being explicitly programmed to do so.
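To make "learning from data without being explicitly programmed" concrete, here is a toy sketch in Python. The sensor values and the load-temperature relationship are invented for illustration; the point is that the model's parameters are fitted from observations rather than hard-coded.

```python
# Invented observations: machine load (%) and resulting temperature (C).
loads = [10.0, 20.0, 30.0, 40.0, 50.0]
temps = [41.0, 52.1, 60.8, 71.2, 80.5]

# Ordinary least-squares fit of temp = slope * load + intercept.
# Nothing about the relationship is programmed in; it is learned
# entirely from the data points above.
n = len(loads)
mean_x = sum(loads) / n
mean_y = sum(temps) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(loads, temps))
         / sum((x - mean_x) ** 2 for x in loads))
intercept = mean_y - slope * mean_x

# The fitted model now informs decisions about new observations.
predicted = slope * 60.0 + intercept
print(f"predicted temperature at 60% load: {predicted:.1f} C")
```

Real industrial models are of course far richer than a straight line, but the workflow is the same: observe, fit, then predict on new data.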

Do you ever find yourself with more items in your Amazon shopping cart than you originally intended? Amazon’s product-recommendation feature uses machine learning to improve your online shopping experience. Amazon takes data about your purchase history and browsing patterns and compares that information to the behavior of other shoppers to automatically recommend other items you might like to buy.
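The "compare your behavior to other shoppers" idea can be sketched in a few lines of Python. The purchase histories below are invented, and Amazon's actual system is vastly more sophisticated; this only shows the basic co-occurrence intuition behind such recommenders.

```python
from collections import Counter

# Hypothetical purchase histories of other shoppers (invented data).
histories = [
    {"shampoo", "conditioner", "soap"},
    {"shampoo", "conditioner", "toothpaste"},
    {"shampoo", "soap"},
    {"toothpaste", "floss"},
]

def recommend(cart, histories, top_n=2):
    """Suggest items that frequently co-occur with what is in the cart."""
    scores = Counter()
    for history in histories:
        if cart & history:                  # this shopper overlaps with us
            for item in history - cart:     # items we don't already have
                scores[item] += 1
    return [item for item, _ in scores.most_common(top_n)]

print(recommend({"shampoo"}, histories))
```

Shoppers who bought shampoo also tended to buy conditioner and soap, so those surface as suggestions.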

Industrial machine learning is used to detect patterns in the health of critical assets — heavy machinery, equipment and components — to predict when they will break down or require maintenance. For example, in the rail industry, railroads use machine learning to detect and prevent catastrophic failures and damage before they happen. Better-informed maintenance decisions can be made more quickly across the entire rail value chain.
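A minimal version of "detecting patterns in asset health" is flagging sensor readings that drift far from their historical norm. The vibration values and threshold below are invented for illustration; production systems use much more sophisticated models than a single statistical threshold.

```python
import statistics

# Hypothetical bearing-vibration readings (mm/s) during normal operation.
baseline = [2.1, 2.3, 2.0, 2.2, 2.4, 2.1, 2.3, 2.2]

mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)

def needs_maintenance(reading, threshold=3.0):
    """Flag readings more than `threshold` standard deviations from normal."""
    return abs(reading - mean) / stdev > threshold

print(needs_maintenance(2.2))   # a reading within the normal band
print(needs_maintenance(6.8))   # an abnormal spike worth inspecting
```

Flagging the spike before the bearing actually seizes is what lets maintenance happen on a schedule instead of after a failure.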

How Does Machine Learning Work in Industry?

Consumer IoT products process data in a centralized cloud to enhance experiences for their customers. Netflix, for example, uses machine learning algorithms to recommend additional video content to viewers. In fact, 75 percent of all Netflix viewing is driven by the company’s recommendation algorithm. Thanks to cloud computing, Netflix can easily process massive amounts of data from all 117.58 million of its customers. But what about when your data is not centralized? What happens when you are taking vast amounts of data not from your smartphone or TV, but from decentralized industrial equipment located around the globe?

Industrial Machine Learning Extends to the Edge

In industrial sectors, the data generated by connected machines poses unique challenges, and each piece of equipment has its own constraints. For example, locomotives transporting precious metals in the Australian outback have connectivity issues. They travel in and out of network coverage, and though it is possible to intermittently deliver data to and receive insights from the cloud, doing so is expensive, impractical and ineffective on the front lines.

In some cases, the amount of data being generated is so massive that there is no other choice but to perform some level of computing at the site of the machine. An offshore oil rig, for example, generates 1-2 terabytes of data daily. It would not make sense to immediately send all of that raw data up to the cloud without at least analyzing it onsite for potential value first.

Industrial data is often too remote, too massive or too time-critical to rely on only cloud computing — it must be augmented with edge computing.

What is Edge Computing?

Edge computing is data collection, processing and analysis that happens at the edge of the network — near the actual source of the data, often by necessity using a lower-cost, power-efficient computer suitable for industrial environments. This enables businesses with limited or no network connection to gain real-time data insights and take immediate action.
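The edge pattern described above can be sketched in a few lines: process raw readings locally and uplink only a compact summary plus any anomalies. This is an assumed design for illustration, not a real product API; the readings and threshold are invented.

```python
def summarize_at_edge(readings, anomaly_threshold):
    """Reduce raw sensor data to a small payload worth sending to the cloud."""
    anomalies = [r for r in readings if r > anomaly_threshold]
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
        "anomalies": anomalies,   # only the interesting raw points survive
    }

# Invented temperature samples collected at the machine.
raw = [98.2, 97.9, 98.4, 112.6, 98.1]
payload = summarize_at_edge(raw, anomaly_threshold=110.0)
print(payload)
```

Instead of streaming every raw sample over an expensive or intermittent link, the site sends one small summary and can act on the anomaly immediately, which is the core trade the edge pattern makes.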

Edge-deployed data science is still in the early stages of development. It requires immense skill and careful collaboration with industry experts and device manufacturers. But by using the powerful combination of on-device embedded computing capabilities and various edge computing patterns, businesses can make the most of both cloud and edge computing, improving the overall effectiveness of their operations and data processing.

Imagine knowing when a critical piece of machinery will break as easily as you know when to reorder shampoo on Amazon. This is the promise of AI and machine learning in industry.

In part three, we’ll discuss how data science fits into the mix.

Don’t want to wait? Download A Survival Guide to AI and Machine Learning in the Fourth Industrial Revolution today.