Users may soon be able to control a robot miles away through virtual reality.
Computer scientists from Brown University have developed software that lets users control robots remotely through virtual reality: data passes between the robot and the VR system over the internet, allowing users to guide robots from long distances.
“In VR, people can just move the robot like they move their bodies, and so they can do it without thinking about it,” Eric Rosen, an undergraduate student at Brown, said in a statement. “That lets people focus on the problem or task at hand without the increased cognitive load of trying to figure out how to move the robot.”
The new software connects a robot's arms and grippers, as well as its onboard cameras and sensors, to off-the-shelf virtual reality hardware over the internet.
Users can control the position of the robot's arms with handheld controllers, so the robot performs intricate manipulation tasks as they move their own arms.
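The article doesn't publish the team's control code, but the core of this kind of teleoperation is a coordinate transform from the controller's VR tracking frame into a target pose in the robot's base frame. A minimal sketch of that idea in Python, where every name and parameter is illustrative rather than the Brown team's actual API:

```python
# Illustrative sketch: map a VR controller pose to a robot end-effector target.
# Names and frames here are assumptions, not the actual system's interface.
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float
    z: float  # position in meters; orientation omitted for brevity

def controller_to_arm_target(controller: Pose, origin: Pose, scale: float = 1.0) -> Pose:
    """Translate a controller pose (VR tracking frame) into a target pose
    for the robot's gripper (robot base frame)."""
    return Pose(
        x=origin.x + scale * controller.x,
        y=origin.y + scale * controller.y,
        z=origin.z + scale * controller.z,
    )

# On each tracking update, the target pose would be handed to the robot's
# inverse-kinematics solver, which moves the arm to match the user's motion.
target = controller_to_arm_target(Pose(0.1, -0.05, 0.2), origin=Pose(0.6, 0.0, 0.3))
print(target)
```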
They can also step into the robot's metal skin for a first-person view of its surroundings, or walk around the robot to survey the scene in the third person.
“We think this could be useful in any situation where we need some deft manipulation to be done, but where people shouldn’t be,” David Whitney, a graduate student at Brown who co-led the development of the system, said in a statement. “Three examples we were thinking of specifically were in defusing bombs, working inside a damaged nuclear facility or operating the robotic arm on the International Space Station.”
Even highly sophisticated robots are often remotely controlled using a keyboard or something like a video game controller and a two-dimensional monitor.
“For things like operating a robotic arm with lots of degrees of freedom, keyboards and game controllers just aren’t very intuitive,” Whitney said.
The software links a Baxter research robot with an HTC Vive, a virtual reality system that comes with hand controllers. Using the robot's sensors, the software builds a point-cloud model of the robot and its surroundings, which is transmitted to a remote computer connected to the Vive.
The researchers kept the data load small enough to be carried over the internet without noticeable lag, while still creating an immersive experience for users.
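The article doesn't say how the data load was kept small, but one standard way to shrink a point-cloud stream is voxel-grid downsampling, which keeps a single representative point per small cube of space before transmission. A minimal, self-contained sketch (the function name and voxel size are assumptions for illustration):

```python
# Illustrative sketch: voxel-grid downsampling of a point cloud,
# a common bandwidth-reduction step before streaming over a network.
def voxel_downsample(points, voxel=0.05):
    """points: iterable of (x, y, z) tuples in meters.
    Returns at most one representative point per voxel-sized cube."""
    seen = {}
    for x, y, z in points:
        # Bucket each point into its voxel by flooring the coordinates.
        key = (int(x // voxel), int(y // voxel), int(z // voxel))
        seen.setdefault(key, (x, y, z))  # keep the first point per voxel
    return list(seen.values())

cloud = [(0.01, 0.02, 0.0), (0.02, 0.01, 0.01), (1.0, 1.0, 1.0)]
print(voxel_downsample(cloud))  # two points survive: one per occupied voxel
```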
In additional studies, 18 novice users were able to complete a cup-stacking task 66 percent faster in virtual reality compared with a traditional keyboard-and-monitor interface.
The researchers now plan to test more complex tasks and eventually combine manipulation with navigation.
The researchers have made the system freely available on the web.