Accelerating robotic innovation

November 09, 2011

HOUSTON -- (Nov. 9, 2011) - Researchers from three universities are collaborating to develop a new generation of design software that can accurately predict the physical behavior of robots prior to prototyping.

"One of our goals is to find a way to do virtual testing so that key flaws can be found on a computer before a prototype is ever built," said Walid Taha, adjunct professor of computer science at Rice University and professor of computer science at Halmstad University in Sweden. Taha is principal investigator on a new research grant from the National Science Foundation (NSF) that brings together researchers from Rice, Halmstad and Texas A&M University.

Taha noted that robots are a study in contrasts. They can perform superhuman feats yet get tripped up by toddler-level tasks. They're digitally programmable, but the intricacies of their physical behavior go far beyond the reach of computer simulations.

"Part of the problem is that robots have a foot in both the digital and physical worlds," said robotics researcher Marcia O'Malley, professor of mechanical engineering and materials science at Rice and co-principal investigator on the new project. "Bridging these worlds is difficult. The physical world is a messy place with both smooth curves and discontinuities that are difficult for computers to deal with."

The upshot is that designing robots today goes something like this: Build computational models and test in simulation. Build prototype at great expense. Test prototype and find unanticipated flaw. Revisit simulation. Redesign prototype. Repeat.

Taha, O'Malley and their collaborators at Rice and Texas A&M hope to change that with new funding from the NSF's Cyber-Physical Systems program.

Modeling and simulation of robotics is not a new idea, but the researchers are taking a new approach. For one thing, they are keen to develop a holistic system that robotics designers can use from start to finish. Currently, designers might use four or more different pieces of software at various points in the design and testing of a new robot. Lack of compatibility from one piece of software to the next is one problem, but an even larger problem can arise when entire concepts are missing or treated wholly differently.

To address this, the team includes Rice programming language expert Corky Cartwright, professor of computer science. Taha, principal investigator on the project, and Cartwright began developing a new programming language called Acumen under an earlier NSF grant. They'll continue to develop and expand the language under the new research program.

Cartwright will work with the project's two hands-on robotics laboratories -- O'Malley's Mechatronics and Haptics Interface (MAHI) lab at Rice and Aaron Ames' A&M Bipedal Experimental Robotics (AMBER) lab at Texas A&M -- to test the language and make sure it is up to the task of day-to-day robotic design. Specifically, Ames and O'Malley will use the new software infrastructure to develop the next generation of two-legged walking robots and robotic assistive devices.

"We should be able to input into the simulation environment any equation that the mechanical engineers give us," Cartwright said.

Ames, assistant professor of mechanical engineering and of electrical and computer engineering at Texas A&M, said, "One area that stands to significantly benefit from these innovations is the design of next-generation prosthetics. The MAHI lab at Rice is already doing work on upper-body prosthetics, and the AMBER lab is working on prosthetics for the lower body. With improved modeling and simulation tools we hope to dramatically accelerate innovation in this area."
-end-
A high-resolution image is available for download at: http://www.media.rice.edu/images/media/NEWSRELS/1109_group.jpg

CAPTION: From left, Aaron Ames of Texas A&M University, Walid Taha of Halmstad University, and Corky Cartwright and Marcia O'Malley, both of Rice University, are developing new design software that can accurately predict the physical behavior of robots.

CREDIT: Jeff Fitlow/Rice University

A copy of the NSF grant abstract is available at: http://www.nsf.gov/awardsearch/showAward.do?AwardNumber=1136099

Rice University
