The hype machine is cranked up to 11 on the topic of machine learning (sometimes called artificial intelligence, though I don't use that term because AI is not really intelligence, and there's nothing artificial about it). Machine learning will either empower the world or take it over, depending on what you read. But before you get swept away by the gust of hot air coming from the technology industry, it's worth pausing to put things in perspective. Maybe explaining it in plain terms will help.
Shortly after the first caveman figured out how to make fire, the second caveman wanted to learn how to make fire, too. However, he didn't -- and couldn't -- check out a book from the local library or take a three-credit college class. Instead, he watched the first caveman make fire, tried it himself, failed, was corrected, and did it again until he got it right. Fundamentally, this is how humans have always learned anything -- by watching, trying, failing, correcting and repeating.
Think about this from a modern perspective. If you were to drop your phone and crack the screen, you would probably go directly to YouTube and search "how to replace an iPhone screen." After watching the video, if the repair seemed within your capability, you might go to Amazon and order a replacement screen kit (though you should probably order two, because you are going to mess up one of them). When the box arrives, you go back to YouTube, watch the same video again, and try to match what the person onscreen is doing. If you succeed, you've completed a very technical task that you never learned in school or took a class to do -- and you probably didn't think you'd be doing it when you got up that morning.
What we are doing is teaching computers to learn the same way we do. We send sets of data to very powerful machine learning software built into cloud platforms like Google's and IBM's, then ask the machine to figure out what the patterns are and what the data means.
Of course, the machine gets it wrong at first, but then the task is to correct the model and run it again. After many iterations, the model gets better and better, the way a pixelated photograph becomes sharper each time more data is sent to fill it in.
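The guess-fail-correct-repeat loop described above can be sketched in a few lines of code. This is only an illustrative toy, not any particular platform's implementation: it fits a simple straight-line model to made-up data by repeatedly guessing, measuring the error, and nudging the model in the right direction (the function name, data, and parameters are all assumptions for the sake of the example).

```python
# A minimal sketch of the watch-try-fail-correct loop: a toy linear
# model (y = w*x + b) trained by repeated guess-and-correct.
# All names and data here are illustrative assumptions.

def train(examples, epochs=200, lr=0.05):
    """Fit y = w*x + b to (x, y) pairs by iterative correction."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in examples:
            guess = w * x + b      # the machine's attempt
            error = guess - y      # how wrong the attempt was
            w -= lr * error * x    # correct the model...
            b -= lr * error        # ...and try again on the next pass
    return w, b

# Toy data that follows the hidden rule y = 2x + 1
data = [(0, 1), (1, 3), (2, 5), (3, 7)]
w, b = train(data)
# After many iterations, w and b approach 2.0 and 1.0 -- the model
# has "learned" the pattern without being told it directly.
```

Each pass through the data is one iteration of the correct-and-repeat cycle from the paragraph above; early guesses are badly wrong, and the corrections shrink the error a little each time.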
Source: Forbes