What you see when Boston Dynamics’ humanoid robot does a backflip or its Spot dog robot fights off a human and opens a door is incredible hardware engineering, to be sure. But what you don’t see is the wildly complex underlying code that makes it possible. What comes so easily to you—OK maybe not backflips, just walking—requires extreme coordination, which roboticists have to replicate, a kind of dance of motors working in concert.
Pity the engineers who have to write out all that code. Over at Google, researchers have a secret weapon to teach robots to move that’s both less taxing and more adorable: dogs. They put the canines on treadmills and take motion-capture videos, then feed that data into a simulator to create a digital version of the pooch. The researchers then translate the digital version of the real dog into a digital version of their four-legged robot—Laikago, which has a rectangular body and skinny legs. Then they port those algorithms into the physical version of Laikago. (The robot is named, by the way, after Laika, the Soviet space dog who was the first animal to orbit Earth.)
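The core trick in this line of work is to turn that retargeted dog motion into a reward signal a learning algorithm can chase in simulation. The sketch below is a minimal, hypothetical illustration of that idea in Python, not the team's actual code: the reference motion is a synthetic stand-in for the retargeted mocap clip, and the reward simply grows as the simulated robot's joint angles get closer to the dog's.

```python
import numpy as np

# Stand-in reference motion: in the real pipeline this would be the dog's
# mocap clip retargeted onto Laikago's skeleton; here it's a synthetic
# (num_frames, num_joints) array so the sketch runs on its own.
num_frames, num_joints = 120, 12  # Laikago has 12 actuated joints, 3 per leg
reference_motion = 0.3 * np.sin(
    np.linspace(0, 2 * np.pi, num_frames)[:, None] + np.arange(num_joints)
)

def imitation_reward(robot_joint_angles, frame_idx):
    """Score how closely the simulated robot matches the reference pose.

    A standard imitation-learning reward: the smaller the pose error
    between the robot and the retargeted dog motion, the closer the
    reward is to 1. A reinforcement-learning algorithm then trains the
    robot's controller to maximize this reward in simulation.
    """
    target_pose = reference_motion[frame_idx % num_frames]
    pose_error = np.sum((robot_joint_angles - target_pose) ** 2)
    return np.exp(-2.0 * pose_error)  # the scale factor is a tuning choice

# Example: evaluate a candidate pose against frame 10 of the reference clip.
candidate_pose = np.zeros(num_joints)
print(imitation_reward(candidate_pose, frame_idx=10))
```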
A robot works quite differently from a biological dog; it has motors instead of muscles, and in general it’s a lot stiffer. But thanks to this translation work, Laikago has learned to move like a real-life canine. Not only that, its learned gait is faster than the quickest one the robot’s manufacturer provides—though in fairness it’s not yet as stable. The new system could be the first steps (sorry) toward robots that learn to move not through exhaustive coding, but by watching videos of animals running and jumping...
The next challenge is known as sim-to-real: taking what the system has learned in simulation and getting it to work on a physical robot. This is tricky because a simulation is an imperfect, highly simplified version of the real world. Mass and friction are represented as accurately as possible, but not perfectly, so the actions of the simulated robot in the digital world don’t map precisely onto the movements of the real robot in the lab.
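One standard way researchers soften that gap is domain randomization: rather than training in a single, fixed simulation, they randomize the quantities the simulator can only approximate, so the controller that emerges doesn't depend on any one guess being exactly right. The snippet below is a rough, hypothetical sketch of that idea; the parameter names and ranges are illustrative, not the values the Google team used.

```python
import numpy as np

# Domain randomization sketch: sample slightly different physical parameters
# for each training episode so the learned controller tolerates the mismatch
# between the simulator and the real robot. Ranges below are illustrative.

def sample_sim_parameters(rng):
    return {
        "body_mass_scale": rng.uniform(0.8, 1.2),     # +/- 20% around nominal mass
        "foot_friction": rng.uniform(0.5, 1.25),      # friction coefficient range
        "motor_strength_scale": rng.uniform(0.9, 1.1),
        "control_latency_s": rng.uniform(0.0, 0.04),  # up to 40 ms of delay
    }

rng = np.random.default_rng(0)
for episode in range(3):
    params = sample_sim_parameters(rng)
    # env = make_simulated_laikago(**params)  # hypothetical simulator factory
    print(f"episode {episode}: {params}")
```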
Read more...
Source: WIRED