Anything that's controlled by a program is a robot. So most robots aren't visible in the ordinary sense of the word; they're just part of the system.
A machine can respond to body movements, facial gestures, or even your leaving the scene. The very fact that you're not there can be an instruction. Bob recalls talking to an engineer who had just come from an automated factory in Kentucky. It made small electric motors, like the kind in vacuum cleaners and sewing machines. The factory floor was all dark; you could just see tiny red and blue lights blinking. After all, the machines didn't need light, so why waste money on lighting?
This kind of thing is increasing and there's a joke about it that goes like this: The factory of the future will only need two attendants: a man and a dog. The man is there to feed the dog, and the dog is there to bark when the man falls asleep.
So Google is going to teach us how to operate machinery without even going to the factory. The dog will be out of a job. Go to TeachableMachine.withGoogle.com. They'll teach you how to do it. No charge. It's a great introduction to "machine learning," which is already a hot career field.
Machine learning powers supercomputers such as IBM's Watson. It's used in facial recognition software, and you see it in the photo apps that sort pictures by face. Other areas include robotics and medicine. Then there's Siri, Alexa and Google Home, or any system that uses voice recognition. It's all machine learning.
To create your own machine learning demo, you don't need to install anything. Just go to "Google's Teachable Machine" and start. You will need a computer that has a built-in camera, which most do now, or you can buy a clip-on camera for older machines; they're cheap.
In less than a minute, we trained Google's website to show a photo, play a sound, or talk to us. When Joy moved her arm across her computer screen, she got a white cat waving its paws. When she was still, she got a fluffy Pomeranian dog. When she pulled her bangs back, a rabbit appeared. We played the air guitar to get a music clip, drummed on the desk to do a drum solo and put a thumb to our lips to get a trombone.
Now all of these results are pretty useless in themselves, but the point is that the same gesture could just as easily trigger some other action, or even several actions. First, though, we had to "train" the machine. That meant holding down the "train" button on the website while waving our arms or making whatever gesture we wanted to associate with a photo, a music clip or a voice. We made the voice say "Way to Go!" when we flexed our muscles.
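For the curious, here's a rough idea of what that kind of training amounts to behind the scenes. Google doesn't spell out the site's exact recipe here, so the little Python sketch below is only an illustration of one common approach: store a handful of labeled examples for each gesture, then classify a new camera frame by finding its nearest stored neighbors. The gestures, actions and feature numbers are all made up for the example; a real system would get its numbers from camera frames run through a pretrained image model.

# Toy sketch of "teachable" training: collect a few labeled examples per
# gesture (as if the "train" button were held down), then classify new
# input by its nearest stored examples. Features here are invented numbers,
# not real camera data.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

def capture_examples(center, n=20, noise=0.5):
    # Pretend to grab n frames of one gesture as 3-number feature vectors.
    return center + noise * rng.standard_normal((n, len(center)))

# "Hold the train button" for each gesture: a small cluster of examples apiece.
gestures = {
    "arm wave": capture_examples(np.array([5.0, 0.0, 0.0])),
    "sitting still": capture_examples(np.array([0.0, 5.0, 0.0])),
    "flexed muscles": capture_examples(np.array([0.0, 0.0, 5.0])),
}

X = np.vstack(list(gestures.values()))
y = [label for label, examples in gestures.items() for _ in examples]
model = KNeighborsClassifier(n_neighbors=5).fit(X, y)

# Whatever action you tied to each gesture fires when that gesture is recognized.
actions = {"arm wave": "show the waving cat",
           "sitting still": "show the Pomeranian",
           "flexed muscles": "say 'Way to Go!'"}

frame = np.array([[4.6, 0.3, -0.2]])   # a new "frame" close to the arm-wave examples
label = model.predict(frame)[0]
print(label, "->", actions[label])

The real site does all of this in the browser with your webcam, no installation needed, which is what makes the five-minute demo possible.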
As an aside, it struck Bob that this could be used as an immediate response to danger. When the bad guys come in pointing a gun at you, just throw up your hands in surrender, and that action could immediately call the police, seal the doors, release the sleeping gas or whatever.
If you want to go further, there's Wekinator.org. It offers a free online course, Machine Learning for Musicians and Artists, from Goldsmiths, University of London. (It's $20 a month if you want college credit.) It's quite techie, but it might just launch a new career. They say it's the only such class oriented toward art and music.
Source: Arkansas Online