December 15, 2015 - 11:53 AMT
NASA's using PlayStation VR to train humanoid space robots

While NASA has a long history of sending probes and rovers into space, advancements in robotics have made the deployment of human-like robots an increasingly attractive prospect. But it turns out that controlling such humanoid robots remotely is challenging. NASA and Sony have been collaborating to explore how VR might be used to train operators to control robots in space, Road To VR reports.

While probes and drones are great at what they do, they are very specialized. The appeal of humanoid robots—those that mimic human form and dexterity—is their flexibility. Humans are amazing generalists, using our brains to achieve things that our bodies were never made for (like space travel). Part of what makes us so adaptable to diverse situations is our bipedal stance, which frees up our arms, hands, and fingers for tasks (instead of locomotion) and for the use of tools. Our hands can grip and manipulate a breadth of forms unmatched by machines—but robots are quickly catching up.

NASA’s Robonaut 2 is a humanoid robot designed with arms, hands, and fingers that move just like ours. But designing dexterous robots for space is only half the challenge of actually making them useful.

While NASA is highly experienced in controlling probes and rovers with carefully planned, math-based maneuvers, human control is quick and intuitive; Robonaut 2’s human-like dexterity is wasted if commands can’t be executed with human fluidity and improvisation.

So NASA is exploring how to control humanoid robots with human input. Modern virtual reality, as it turns out, may provide the best way to do just that—by making the robot mimic the input of a remote operator—and NASA collaborated with Sony to create a PlayStation VR tech demo called Mighty Morphenaut to explore how this might work.
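The core of the mimicry approach is simple in principle: each frame, read the operator's tracked pose and drive the robot's matching limb toward it. As an illustration only, here is a minimal Python sketch of that loop—the function names and the smoothing scheme are assumptions for the sake of example, not anything from NASA's or Sony's actual demo. Smoothing the target is a common way to keep tracking jitter from being passed straight to the robot.

```python
def mirror_pose(operator_pose, smoothing=0.5, prev_target=None):
    """Map a tracked operator hand position (x, y, z) to a robot
    target position, blending with the previous target so that
    VR tracking jitter is damped rather than mirrored directly.

    Hypothetical helper for illustration; real teleoperation would
    also handle orientation, joint limits, and communication delay.
    """
    if prev_target is None:
        # First frame: no history, follow the operator directly.
        return operator_pose
    # Exponential smoothing: keep part of the old target,
    # move partway toward the new operator pose.
    return tuple(
        smoothing * p + (1 - smoothing) * o
        for p, o in zip(prev_target, operator_pose)
    )

# Simulated tracking samples for one hand, once per frame.
samples = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (0.2, 0.1, 0.0)]
target = None
for s in samples:
    target = mirror_pose(s, smoothing=0.5, prev_target=target)
    # In a real system, `target` would be sent to the robot's
    # arm controller here.
    print(target)
```

In practice the hard part is everything this sketch leaves out—most notably the round-trip signal delay to a robot in orbit, which is exactly why intuitive, low-training control interfaces are worth exploring.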

“The hope is that by putting people in an environment where they can look around and move in ways that are much more intuitive than with a mouse and keyboard, it would require less training to understand how to operate the robot and enable quicker, more direct control of the motion,” said Garrett Johnson, a software engineer at NASA’s JPL.