The first robot that autonomously became aware of its own body

From childhood, we humans learn about our own bodies autonomously. So far, no robot has been built with this capability, but a team from Columbia Engineering has announced that it has created a robot that, for the first time, is able to learn a model of its whole body from scratch, without any human help. It did so by looking at itself in a "mirror", actually a set of cameras pointed at it, and by practicing from scratch it reached full awareness of its body in just three hours.

A robot that has become aware of itself by "looking in the mirror"

In a new study published in Science Robotics, the researchers demonstrate how their robot created a kinematic model of itself and then used that self-model to plan movements, achieve goals, and avoid obstacles in a variety of situations.
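To make the idea concrete, here is a minimal, hypothetical sketch (in Python with PyTorch, not the authors' code) of how a learned self-model could be used to avoid obstacles: a planner simply asks the model whether any point of a known obstacle would be occupied by the robot's body along a candidate motion. The function names, shapes, and the 0.5 threshold are illustrative assumptions.

```python
# Hedged sketch: collision checking with a learned self-model.
# `self_model(joints, points)` is assumed to return an occupancy logit
# for each (pose, 3D point) pair; it is defined later in this article's
# learning sketch. All names here are illustrative, not from the paper.
import torch


def motion_is_safe(self_model, joint_path, obstacle_points, threshold=0.5):
    """Reject a candidate joint-space path if the predicted body volume
    overlaps any obstacle point at any waypoint.

    joint_path:      iterable of (n_joints,) tensors (waypoints)
    obstacle_points: (N, 3) tensor of obstacle sample points
    """
    with torch.no_grad():
        for q in joint_path:
            # Repeat the waypoint pose for every obstacle point.
            q_batch = q.expand(obstacle_points.shape[0], -1)
            occupancy = torch.sigmoid(self_model(q_batch, obstacle_points))
            if (occupancy > threshold).any():
                return False  # body predicted to overlap an obstacle
    return True
```

A planner could call such a check on each candidate trajectory and keep only those that come back safe; the paper's actual planning pipeline may differ.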

This is an important technological step toward the robots of the future: robots capable of interacting naturally with the environment around them. As every person knows (or should know), our body image is never 100% accurate or realistic, but it is good enough to determine how we function and act in the world. That internal model lets our brain plan movements in advance, so that we can move without bumping into things, tripping, or falling.

Hod Lipson, professor of mechanical engineering and director of Columbia's Creative Machines Lab, where the work was done, says:

“We were really curious to see how the robot imagined (…) but you can't just peek into a neural network, it's a black box. (…) It was a sort of flickering cloud that seemed to engulf the three-dimensional body of the robot (…) as the robot moved, the flickering cloud followed it gently.”


Video cameras were used as "mirrors"

To let the robot get to know itself, the researchers placed a robotic arm inside a circle of five cameras. The robot was then observed through these cameras as it moved freely. Like a child exploring itself for the first time in a hall of mirrors, the robot wiggled and contorted to learn exactly how its body moved in response to various motor commands, and what the limits of those movements were. After about three hours, the robot stopped: the deep neural network underlying its intelligence had finished learning the relationship between the robot's motor actions and the volume it occupied in its environment.
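What the paragraph above describes is, at its core, a supervised learning problem: map a motor command (a joint configuration) plus a point in space to a prediction of whether that point lies inside the robot's body. Below is a minimal, hypothetical PyTorch sketch of such a model; the architecture, layer sizes, and names are assumptions for illustration, not the network described in the paper.

```python
# Minimal sketch (not the authors' code): an implicit self-model that
# predicts whether a 3D point is occupied by the robot at a given pose.
import torch
import torch.nn as nn


class SelfModel(nn.Module):
    def __init__(self, n_joints: int, hidden: int = 256):
        super().__init__()
        # Input: joint angles concatenated with a 3D query point.
        self.net = nn.Sequential(
            nn.Linear(n_joints + 3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),  # logit: occupied vs. free
        )

    def forward(self, joints: torch.Tensor, points: torch.Tensor) -> torch.Tensor:
        # joints: (B, n_joints), points: (B, 3) -> (B,) occupancy logits
        return self.net(torch.cat([joints, points], dim=-1)).squeeze(-1)


def train_step(model, optimizer, joints, points, occupied):
    """One supervised step on (pose, point, occupancy-label) batches,
    where `occupied` is a float tensor of 0/1 labels."""
    logits = model(joints, points)
    loss = nn.functional.binary_cross_entropy_with_logits(logits, occupied)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this sketch, the training labels would come from the cameras: at each recorded pose, sampled 3D points are marked as inside or outside the robot's observed silhouette, which is the "mirror" data the article refers to.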

The ability of robots to model themselves without assistance from engineers is important for many reasons: it not only saves labor, but also lets a robot keep up with its own wear and tear, and even detect and compensate for damage. The authors argue that this capability matters because we need autonomous systems to be more self-sufficient. A factory robot, for example, might detect that something is not working as it should and compensate, or request assistance.
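As an illustration of that last point, and purely as an assumption about how such a check might look (the paper does not prescribe this code), a robot could compare what its self-model predicts with what its cameras currently observe, and flag a mismatch as possible wear or damage:

```python
# Hedged sketch: self-consistency check between the learned self-model
# and fresh camera observations. Names and the tolerance are illustrative.
import torch


def self_consistency_error(self_model, joints, observed_points, observed_labels):
    """Mean disagreement between predicted and observed occupancy.

    joints:          (N, n_joints) poses at observation time
    observed_points: (N, 3) sampled 3D points
    observed_labels: (N,) 0/1 floats from the cameras (inside/outside body)
    """
    with torch.no_grad():
        predicted = torch.sigmoid(self_model(joints, observed_points))
    return (predicted - observed_labels).abs().mean().item()


def check_for_damage(self_model, joints, observed_points, observed_labels, tolerance=0.1):
    error = self_consistency_error(self_model, joints, observed_points, observed_labels)
    if error > tolerance:
        # The body no longer matches the model: re-learn it or ask for help.
        return "self-model out of date: re-learn or request assistance"
    return "ok"
```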

Boyuan Chen, who led the work and is now an assistant professor at Duke University, explained:

“We humans clearly have a notion of ourselves (…) Close your eyes and try to imagine how your body would move if you were to take some action, such as stretching your arms forward or taking a step back. Somewhere in our brain we have a notion of ourselves, a model of our body that informs us what volume of our immediate surroundings we occupy and how that volume changes as we move.”

The article "The first robot that autonomously became aware of its body" originally appeared on Tech CuE | Close-up Engineering.