Artificial intelligence researchers are showing off a bipedal robot that learned to walk in a simulation.
The robot awkwardly takes small steps across a hall, attached to a support frame that catches it in case of an emergency. Should this be impressive? After all, Boston Dynamics' humanoid Atlas robot can run parkour courses, jump, and dance.
As is often the case, the devil is in the details. Anyone who knows Boston Dynamics' marketing videos might conclude that robots can already walk on two legs and the problem is solved. But behind every impressive Atlas step there are often months of engineering. Boston Dynamics' exact development process is a trade secret, but a great deal of manual work still goes into the motor skills of robots like Atlas or Spot.
The comparatively less agile robot on its support frame, on the other hand, learned to walk upright on its own: in a simulation. Its tentative steps show clearly that the learning process is far from complete. But the researchers involved are working toward a long-held dream of robotics: robots that independently learn complex motor skills through reinforcement learning in simulation and can reliably apply them in the real world under ever-changing conditions.
The legged robot Cassie was trained by researchers at the University of California, Berkeley. Cassie can walk in different directions, turn, crouch, carry a load, and recover its footing when it slips.
The control is split between human and machine: a person sets the direction of movement with a controller, but the movements of the two metal legs are handled autonomously by the AI.
AI training in simulation, with artificial obstacles
Cassie was trained in two steps: first, the AI learns from a database of reference movements how legs are moved with motors. Equipped with this basic motor repertoire, the AI begins simulation training.
In the simulation, the AI receives random movement commands and has to steer a virtual Cassie robot accordingly: sometimes forward, sometimes backward, sometimes sideways, and turns are needed to maneuver around obstacles. Through trial and error, the AI learns to walk the virtual world without falling.
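The idea of training against random movement commands can be sketched in a few lines. This is an illustrative toy, not the researchers' actual controller: the "policy" is reduced to a single gain, the reward simply measures how well a commanded velocity is tracked, and a crude hill-climb stands in for the policy-gradient updates used in real reinforcement learning.

```python
import random

random.seed(0)  # for reproducibility of this sketch

def sample_command():
    # Random target velocities, as in the randomized training commands:
    # forward/backward, sideways, and turning rate (units are illustrative).
    return {
        "vx": random.uniform(-1.0, 1.0),   # forward/backward (m/s)
        "vy": random.uniform(-0.3, 0.3),   # sideways (m/s)
        "yaw": random.uniform(-0.5, 0.5),  # turning rate (rad/s)
    }

def tracking_reward(achieved, command):
    # Higher reward the closer the achieved motion is to the command.
    return -sum(abs(achieved[k] - command[k]) for k in command)

def train(episodes=1000, step=0.1):
    # Stand-in "policy": one gain that scales commands into achieved motion.
    gain = 0.0
    for _ in range(episodes):
        cmd = sample_command()
        reward = tracking_reward({k: gain * v for k, v in cmd.items()}, cmd)
        # Try a small random change and keep it if tracking improves
        # (a toy substitute for trial-and-error policy optimization).
        trial = gain + random.uniform(-step, step)
        if tracking_reward({k: trial * v for k, v in cmd.items()}, cmd) > reward:
            gain = trial
    return gain

gain = train()  # converges toward 1.0, i.e. perfect command tracking
```

The perfect policy here is simply `gain = 1.0`; the point is only the structure of the loop: sample a random command, act, score the result, improve.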
To make sure Cassie would also perform well in the real world, the researchers randomly varied a number of factors in the simulation. For example, they simulated sensor noise or communication delays between the AI's commands and the response of Cassie's motors.
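This technique is known as domain randomization. A minimal sketch of the idea, with assumed parameter names and ranges (the article does not specify them): each training episode draws different noise, delay, and physics parameters so the policy cannot overfit to one idealized simulator.

```python
import random

def randomized_episode_config():
    # Draw a fresh set of simulator perturbations for each episode.
    # All names and ranges here are illustrative assumptions.
    return {
        "sensor_noise_std": random.uniform(0.0, 0.05),  # noisy joint readings
        "action_delay_steps": random.randint(0, 3),     # delayed motor commands
        "ground_friction": random.uniform(0.4, 1.0),    # varied surfaces
        "payload_kg": random.uniform(0.0, 5.0),         # carried load
    }

def noisy_observation(true_value, cfg):
    # Corrupt a sensor reading according to the episode's noise level.
    return true_value + random.gauss(0.0, cfg["sensor_noise_std"])

cfg = randomized_episode_config()
obs = noisy_observation(1.0, cfg)
```

Because the policy only ever sees perturbed episodes, it is forced to learn behavior that works across the whole range of conditions rather than exploiting one exact simulator.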
After the simulation training, the researchers tested Cassie in a second simulation that modeled real-world physics as accurately as possible. This lets them evaluate the locomotion behavior learned in the first simulation before deploying it on the real robot.
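Such a sim-to-sim check can be thought of as a deployment gate. A hypothetical sketch (function names and the 90% threshold are assumptions, not from the article): roll the trained policy out many times in the higher-fidelity simulator and only move to hardware if it stays upright often enough.

```python
import random

def rollout_succeeds(policy_quality):
    # Stand-in for a full physics rollout: succeed with a probability
    # tied to how robust the trained policy is (0.0 to 1.0).
    return random.random() < policy_quality

def passes_sim2real_gate(policy_quality, trials=100, threshold=0.9):
    # Run many rollouts in the high-fidelity simulator and require a
    # minimum success rate before deploying on the real robot.
    successes = sum(rollout_succeeds(policy_quality) for _ in range(trials))
    return successes / trials >= threshold

robust = passes_sim2real_gate(1.0)   # a policy that never falls passes
fragile = passes_sim2real_gate(0.0)  # a policy that always falls is rejected
```

The real evaluation is of course a physics simulation rather than a coin flip, but the logic is the same: the second simulator acts as a filter between training and the real world.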
Thanks to the AI training, the real Cassie robot can walk on different surfaces without further fine-tuning and can withstand external forces such as shoves. Cassie even proved itself in a real failure: it compensated for the breakdown of two motors in one leg and kept its balance. This situation was never part of the training, which shows that Cassie has learned robust motor skills.
Simulation is accelerating the development of artificial intelligence
The training approach used for Cassie could in the future speed up the development of robot prototypes and help carry the often excellent performance that robots, and AI in general, show in simulation over into reality.
Bridging this "reality gap" is one of the major challenges of reinforcement learning in robotics: the transfer of simulation performance to the real world is discussed under the heading "Sim2Real", and the discourse is shaped by AI that fails in reality at tasks and situations it can master, or at least attempt, in simulation.
But it will be some time before Cassie runs parkour courses in style or dances like Atlas. Next, the researchers want to improve Cassie's walking skills and teach the robot new movements.
An advertisement-friendly trick like those of Atlas may well emerge along the way: a YouTube video shows the robot riding a snakeboard in a simulation. With this steerable variant of a skateboard, the biped curves around obstacles.
The group is also working on a robot companion for blind and visually impaired people. You can see the prototype in action in the next video.
Banner image: Li et al. | Source: Arxiv
The AI gets you moving: the robot learns to walk independently. Last update: April 10, 2021.