Force Push VR brings Jedi powers to life

November 30, 2018

Fans of the Star Wars franchise will have to wait more than a year to get their fix of Jedi-laden telekinetic spectacle on the big screen. The as-yet-untitled Episode IX, the final installment of the space saga that began in 1977, won't be released until December 2019.

In the interim, stalwart practitioners of Jedi ways and other Force-sensitive beings can look to the small screen and thank Virginia Tech researchers for a recently developed virtual reality technique called Force Push.

Force Push gives users the ability to move faraway virtual objects with Yoda-like calm, nuance, and focus, using a new approach to remote object manipulation in VR.

"You basically push the object in the direction you want it to move to, just like in Star Wars when the Jedi masters try to move an object that's placed remotely, they can push or pull it," said Run Yu, Ph.D. candidate in the Department of Computer Science and the Institute for Creativity, Technology, and the Arts. Yu is first author on the recently published article in Frontiers in ICT detailing the research.

It's as simple as using subtle hand gestures to push, pull, or twirl objects: users manipulate objects in the VR setting with their bare hands, through a natural mapping of gestures to actions.

"We wanted to try and do this without any device, just using your hands, and also do it with gestures in a way that's more playful," said Doug Bowman, the Frank J. Maher Professor of Computer Science and director of the Center for Human Computer Interaction.

Force Push provides a more physical, nuanced experience than traditional hand controllers allow in VR. It responds to the speed and magnitude of hand gestures to accelerate or decelerate objects in a way that users can understand intuitively.

The ability to respond to nuanced hand movement comes from the technique's novel physics-driven algorithms. Dynamically mapping rich features of the input gestures to properties of a physics-based simulation keeps the interface controllable in most cases. With Force Push, it's just as easy for users to apply the gentlest of nudges to an object as it is to hurl a heavy object across the room. The researchers also believe the physics-based approach makes Force Push more plausible, giving users a "realistic" experience of these magical powers.
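To give a rough sense of how such a gesture-to-physics mapping could work, here is a minimal sketch in Python. It uses a toy one-dimensional rigid body and an invented gesture_to_force() mapping with made-up gains and frame rate; the actual Force Push interface was built on Unity's physics engine, so this is an illustration of the general idea, not the published algorithm.

    # Hypothetical sketch of a physics-driven gesture-to-force mapping,
    # loosely following the description above. The real Force Push system
    # used Unity's native physics engine; all names, gains, and constants
    # here are illustrative assumptions, not the researchers' code.

    from dataclasses import dataclass

    @dataclass
    class RigidBody:
        mass: float            # kg
        velocity: float = 0.0  # m/s along the push direction
        position: float = 0.0  # m along the push direction

    def gesture_to_force(hand_speed: float, hand_displacement: float,
                         gain: float = 4.0, boost: float = 1.5) -> float:
        """Map gesture features (speed and magnitude of the hand motion)
        to a force: a slow, short nudge yields a small force, while a
        fast, sweeping push yields a much larger one."""
        return gain * hand_speed + boost * hand_speed * hand_displacement

    def step(body: RigidBody, force: float, dt: float = 1 / 90) -> None:
        """Advance the simulation by one frame (90 Hz, a typical VR rate).
        The force accelerates or decelerates the object rather than
        teleporting it, which keeps the response feeling plausible."""
        acceleration = force / body.mass
        body.velocity += acceleration * dt
        body.position += body.velocity * dt

    if __name__ == "__main__":
        crate = RigidBody(mass=2.0)
        # Gentle nudge: slow hand, small displacement.
        step(crate, gesture_to_force(hand_speed=0.1, hand_displacement=0.02))
        # Forceful shove: fast hand, large displacement.
        step(crate, gesture_to_force(hand_speed=1.5, hand_displacement=0.30))
        print(f"velocity after nudge + shove: {crate.velocity:.3f} m/s")

Because the gesture features feed a simulated force rather than directly setting the object's position, the same interface naturally covers both delicate adjustments and dramatic throws.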

For the user experiments, the team used an Oculus Rift CV1 headset for display and a Leap Motion controller for hand tracking. The virtual environment was built in the Unity game engine, whose native physics engine drove the physics-based simulation behind the Force Push interface.

"Every week we kind of tweak something different in order to make the experience feel right," said Bowman. "But now it feels really cool."
-end-
Written by Amy Loeffler

Virginia Tech
