For all the buzz around IoT, the internet of things ultimately depends on machine learning and AI. And while much of the attention in the IoT world is drawn to the explosion of software in everyday devices, another technology revolution is happening that may be even more far-reaching in impact. The cloud itself is changing from a place where data is collected and stored to a place where it is interpreted and understood through the power of machine learning.
According to PwC, artificial intelligence will contribute $15.7 trillion to the global economy by 2030, so businesses stand to reap huge benefits from investing in AI. The MIT Sloan Management Review’s 2017 Artificial Intelligence Global Executive Study and Research Project found that 85% of executives believe AI will help their businesses gain or sustain a competitive advantage.
Throughout the history of computing, the fundamental goal has been data processing: carefully collecting, curating, and storing data to be processed by machines. It was first achieved through punched cards, then through tapes and disk drives, and today on the cloud. We called them analytics tools (root word: analyze) as if they did any analysis at all. In fact, they were just tools to slice and dice the data and present it in ever more sophisticated ways for some human to figure out what it all meant.
When cavemen wanted to learn how to make fire, they didn’t check out a library book or take a college course. Instead, they watched someone start a fire, then tried it themselves…and failed. They were corrected and tried again until they got it right. Learning through trial and repeated failure is how humans have always learned. This helps explain the popularity of YouTube, which is largely made up of videos of people doing things for you to observe and perhaps learn from.
Curiously, though, this is not how we’ve been using computers until now. The concept of machine learning is to teach a computer to interpret data by asking it to build its own model. Of course, it fails at first; after correction, it tries again. After many iterations of failure and retraining, the computer arrives at a model that can interpret data at high speed with a high rate of success. This is a new way of learning, and we are only just beginning to understand its impact.
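The try-fail-correct loop described above can be sketched in a few lines of Python. This is a minimal illustration, not any particular product's algorithm: a toy linear model repeatedly guesses, is told how wrong it was, and adjusts itself. The function name, learning rate, and data are all illustrative assumptions.

```python
# A minimal sketch of the fail-correct-retry loop: a toy linear model
# (y = w * x + b) trained by repeated prediction and correction.
# All names, values, and data here are illustrative assumptions.

def train(points, epochs=500, lr=0.05):
    """Fit y = w * x + b by repeatedly predicting, measuring the error
    (the 'correction'), and nudging the model toward a better answer."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in points:
            guess = w * x + b      # the model tries...
            error = guess - y      # ...and is told how wrong it was
            w -= lr * error * x    # adjust based on the correction
            b -= lr * error
    return w, b

# Toy data generated from y = 2x + 1; training should recover roughly (2, 1).
data = [(x, 2 * x + 1) for x in [0, 1, 2, 3, 4]]
w, b = train(data)
print(w, b)
```

The point is not the arithmetic but the shape of the process: no rules were programmed in, yet after enough corrected failures the model captures the pattern in the data.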
Understand that machine learning is distinctly different from artificial intelligence, as it’s not really intelligence, and there’s nothing artificial about it.
Machine learning may be necessary simply to deal with the crush of data arriving at an ever-increasing rate from everyday devices. Each new smart device that ships expands the surface area of data collection; if there was ever a time when we had enough storage to capture it all, that time has passed. It no longer makes economic sense to capture every data point, since only selected data points are needed to train models. This computer assist of humans through machine learning is all but required to sift through the data from the internet of things.
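One common way to keep only selected data points from a device stream is reservoir sampling, which maintains a fixed-size uniform sample in constant memory no matter how much data flows past. The sketch below is an illustrative assumption about how such selection might be done, not a reference to any specific IoT platform; the stream and sample size are made up.

```python
import random

def reservoir_sample(stream, k, rng=random):
    """Return up to k items chosen uniformly at random from an iterable
    stream, using O(k) memory regardless of the stream's length."""
    sample = []
    for i, item in enumerate(stream):
        if i < k:
            sample.append(item)    # fill the reservoir first
        else:
            j = rng.randint(0, i)  # replace with decreasing probability
            if j < k:
                sample[j] = item
    return sample

# Keep 10 representative readings out of a million simulated sensor values.
readings = reservoir_sample(range(1_000_000), 10)
print(readings)
```

The design choice matters here: because the sample size is fixed up front, storage cost stays flat even as the number of shipped devices, and thus the data rate, keeps growing.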