Industry's 'Artificial Intelligence' would technically be called 'Deep Learning.'
Academia doesn't consider artificial intelligence (AI) and machine learning (ML) interchangeable, and as an ex-academic I sympathize with their definitions: technically, AI is a proper superset of ML, which is a proper superset of deep learning. Deep learning (DL) is ML that uses a particular class of algorithms (neural networks), and it's what industry tends to mean when it says AI. How's that for proper?
But I also think that most people (and industry) don't particularly care about that distinction and will use language less formally. Language evolves, whether we like it or not. Originally used by professorial types, the term AI has escaped their clutches and entered the common vernacular as something else.
At the risk of offending researchers, I feel it's most helpful to acknowledge the new way industry is using the term and explain the modal usage for readers who aren't interested in nuance. It's okay to let the English language evolve as long as we keep up. Coined in 1956, the term AI was never defined all that strictly. (Right, academics? Remember the days when AI was an embarrassing term
to use on your grant applications… so you just replaced all instances of it with machine learning?) With poorly defined terms, there's not really such a thing as using them correctly. We can all be winners. The words move around.
And watch out, definitions lawyers: wouldn't it be embarrassing if what you're calling AI is actually, technically, reinforcement learning (RL) and you're also misusing your words? Here, have a hug; we can all be friends. If your definition hinges on sequences of actions, planning, gathering information from an environment, and figuring out policies for behaving in the future (a classic example is a computer learning to do stunt maneuvers with toy helicopters), then you might be thinking of RL.
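To make that loop concrete, here's a minimal sketch of the interaction RL is built around: an agent acts, an environment responds with a new state and a reward, and the agent nudges its policy toward actions that paid off. Everything here (the toy environment, the reward rule, the learning rate) is invented purely for illustration; real RL problems involve far richer environments and algorithms.

```python
import random

random.seed(0)  # make this toy example reproducible

def step(state, action):
    """Toy environment: the 'right' action at each state is state % 2."""
    reward = 1.0 if action == state % 2 else 0.0
    return (state + 1) % 10, reward  # next state, reward

q = {}  # action-value table: (state, action) -> estimated value
state = 0
for _ in range(2000):
    # Epsilon-greedy policy: mostly exploit the best-known action, sometimes explore.
    if random.random() < 0.2:
        action = random.choice([0, 1])
    else:
        action = max([0, 1], key=lambda a: q.get((state, a), 0.0))
    next_state, reward = step(state, action)
    old = q.get((state, action), 0.0)
    q[(state, action)] = old + 0.5 * (reward - old)  # nudge estimate toward reward
    state = next_state

# After enough interaction, the value table favors the rewarded action in each state.
```

The point isn't the specific update rule; it's the shape of the problem: no labeled dataset up front, just actions, feedback, and a policy that improves over time.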
Human-like intelligence (HLI) is a better term for the sci-fi version.
If you're drowning in all this alphabet soup of AI, ML, DL, and RL while looking desperately for the robot entities of sci-fi, then you might like the term HLI: human-like intelligence. If you're going to refer to 'an AI' in a way that evokes personhood, it's probably best to call it 'an HLI' instead. Those of you worried that there's HLI lurking in every cupboard can breathe easy; all those industry AI applications are not HLI and aren't about building actual minds. Everyone's too busy using AI to solve real business problems that involve some solid, unglamorous thing-labeling.
Let's summarize. Close your ears, professors. Everyone else: when you hear them talked about in industry, AI and machine learning may as well be synonyms, and they have little to do with HLI.
In practice, you don't have to classify your problem as AI or ML before you begin.
Here's another reason a practical cat like me can live with that: from an applied process standpoint, you needn't classify your business problem as AI/ML/DL before you begin. Just try all the algorithms you can and iterate toward the better performer. If non-DL ML is the wrong approach, you'll find out quickly and course-correct. But it's usually best to try the simpler option first, even if you doubt it'll work; it only takes a few minutes. If you can fit a neural network to it, you can try putting a line through it too. (It's what, 2-5 lines of code? Even if you're not using a package and are implementing from scratch, it's easy. If you forgot the formula for regression, my friends from grad school made a catchy song about it for you so it sticks.) As a bonus, if the simple thing performs well, you've got yourself a solution that will be easier to maintain in production. Good luck, have fun, and may the best algorithm win!
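In case "a few lines of code" sounds like an exaggeration, here's what putting a line through it looks like with NumPy. The data is synthetic and the numbers are made up purely for illustration; the actual fit is one line.

```python
import numpy as np

# Made-up data: a true line (slope 3, intercept 2) plus a little noise.
rng = np.random.default_rng(42)
x = rng.uniform(0, 10, size=100)
y = 3.0 * x + 2.0 + rng.normal(0, 0.5, size=100)

# "Putting a line through it": ordinary least squares, one call.
slope, intercept = np.polyfit(x, y, deg=1)
print(f"slope={slope:.2f}, intercept={intercept:.2f}")  # should land near 3 and 2
```

If that simple fit performs about as well as the fancy model, you've just saved yourself a lot of maintenance.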