A robotic stand-in mimics a remote user's movements in VR

Newswise — Media Note: VRoxy images can be viewed and downloaded here: https://cornell.box.com/v/VRoxyrobotproxy

ITHACA, New York – Researchers at Cornell University and Brown University have developed a telepresence robot that responds automatically and in real time to a remote user's movements and gestures made in virtual reality.

The robotic system, called VRoxy, allows a remote user in a small space, such as an office, to collaborate via VR with teammates in a much larger space. VRoxy is the researchers' latest step in remote, robotic embodiment.

Using a VR headset, the user can switch between two viewing modes: Live mode displays an immersive image of the shared workspace for real-time interaction with local collaborators, while Navigation mode displays rendered pathways through the room, allowing remote users to "teleport" to where they want to go. Navigation mode gives the remote user faster, smoother mobility and limits motion sickness.

The automated nature of the system allows remote teammates to focus solely on collaboration rather than manually controlling the robot, the researchers said.

"A big benefit of virtual reality is that we can leverage all kinds of locomotion techniques that people use in virtual reality games, like instantly moving from one position to another," said Mose Sakashita, a doctoral student in information science at Cornell. "This functionality enables remote users to physically occupy a very limited amount of space while collaborating with teammates in a much larger remote environment."

Sakashita is the lead author of "VRoxy: Wide-Area Collaboration From an Office Using a VR-Driven Robotic Proxy," which will be presented at the ACM Symposium on User Interface Software and Technology (UIST), Oct. 29–Nov. 1.

According to the researchers, VRoxy's automated, real-time response is key for both remote and local teammates. With a robot proxy like VRoxy, a remote teammate confined to a small office can interact in a group activity that takes place in a much larger space, as in a design collaboration scenario.

For local teammates, the VRoxy robot automatically mimics the user's body position and other vital nonverbal cues that are otherwise lost with telepresence robots and on Zoom. For example, VRoxy's monitor – which displays a rendering of the user's face – tilts to match the user's focus.

The robot features a 360-degree camera, a monitor that displays facial expressions captured by the user's VR headset, a robotic pointing finger, and omnidirectional wheels.

In future work, Sakashita wants to equip VRoxy with robotic arms, allowing remote users to interact with physical objects in the live space through the robot proxy.

This research was supported by the National Science Foundation and the Nakajima Foundation.

For more information, see the Cornell Chronicle story.

Cornell University has dedicated television and audio studios available for media interviews.