Texas A&M researchers help give robotic arms a steady hand for surgeries

April 29, 2020

Steady hands and uninterrupted, sharp vision are critical when performing surgery on delicate structures like the brain or hair-thin blood vessels. While surgical cameras have improved what surgeons see during operative procedures, the "steady hand" has yet to see comparable gains -- even sophisticated surgeon-guided robotic hands cannot prevent accidental injuries when operating close to fragile tissue.

In a new study published in the January issue of the journal Scientific Reports, researchers at Texas A&M University show that delivering small but perceptible buzzes of electrical current to the fingertips can give users an accurate perception of distance to contact. This insight enabled users to control their robotic fingers precisely enough to land gently on fragile surfaces.

The researchers said that this technique might be an effective way to help surgeons reduce inadvertent injuries during robot-assisted operative procedures.

"One of the challenges with robotic fingers is ensuring that they can be controlled precisely enough to softly land on biological tissue," said Hangue Park, assistant professor in the Department of Electrical and Computer Engineering. "With our design, surgeons will be able to get an intuitive sense of how far their robotic fingers are from contact, information they can then use to touch fragile structures with just the right amount of force."

Robot-assisted surgical systems, also known as telerobotic surgical systems, are physical extensions of a surgeon. By controlling robotic fingers with movements of their own fingers, surgeons can perform intricate procedures remotely, expanding the number of patients to whom they can provide medical attention. Also, the small size of the robotic fingers means that surgeries are possible with much smaller incisions, since surgeons no longer need large cuts to accommodate their hands inside the patient's body during operations.

To move their robotic fingers precisely, surgeons rely on live video streamed from cameras fitted on the telerobotic arms. They watch monitors to match their finger movements with those of the telerobotic fingers; in this way, they know where the robotic fingers are in space and how close the fingers are to each other.

However, Park noted that visual information alone is not enough to guide the fine finger movements that are critical when the fingers are in close vicinity to the brain or other delicate tissue.

"Surgeons can only know how far apart their actual fingers are from each other indirectly, that is, by looking at where their robotic fingers are relative to each other on a monitor," Park said. "This roundabout view diminishes their sense of how far apart their actual fingers are from each other, which then affects how they control their robotic fingers."

To address this problem, Park and his team developed an alternative way to deliver distance information, one that is independent of visual feedback. By passing electrical currents of different frequencies onto the fingertips via gloves fitted with stimulation probes, the researchers trained users to associate the frequency of the current pulses with distance: increasing frequency signaled closing distance to a test object. They then tested whether users who received the current stimulation along with on-screen visual information about closing distance estimated proximity better than those who received the visual information alone.
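To make the encoding concrete, here is a minimal sketch in Python of one way such a mapping could work; the frequency range, distance range, and linear shape are illustrative assumptions, not values from the paper.

```python
# Hypothetical distance-to-frequency encoding: the closer the robotic
# fingertip is to the surface, the higher the pulse frequency delivered
# to the user's fingertip. All numbers below are assumed for illustration.

def distance_to_frequency(distance_mm: float,
                          max_range_mm: float = 10.0,
                          min_hz: float = 20.0,
                          max_hz: float = 100.0) -> float:
    """Map remaining distance to a stimulation pulse frequency in Hz."""
    d = min(max(distance_mm, 0.0), max_range_mm)   # clamp to [0, max_range_mm]
    closeness = 1.0 - d / max_range_mm             # 0.0 = far, 1.0 = touching
    return min_hz + closeness * (max_hz - min_hz)

print(distance_to_frequency(10.0))  # 20.0 Hz -- fingertip far from the surface
print(distance_to_frequency(0.0))   # 100.0 Hz -- on the verge of contact
```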

Park and his team also tailored the technology to each user's sensitivity to electrical current frequencies: if a user could perceive a wider range of frequencies, the distance information was delivered in smaller steps of increasing current frequency, maximizing the accuracy of proximity estimation.
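Continuing the sketch above, per-user calibration might look like the following; the idea that a user's perceptible frequencies come in evenly spaced, just-noticeable steps is an assumption made for illustration.

```python
# Hypothetical per-user calibration: a user who can discriminate finer
# frequency differences gets more, smaller steps, so the same distance
# range is encoded with finer resolution.

def calibrated_steps(min_hz: float, max_hz: float,
                     just_noticeable_hz: float) -> list[float]:
    """Pulse frequencies this user can reliably tell apart, low to high."""
    n = int((max_hz - min_hz) / just_noticeable_hz) + 1
    return [min_hz + i * just_noticeable_hz for i in range(n)]

def quantized_frequency(distance_mm: float, max_range_mm: float,
                        steps: list[float]) -> float:
    """Snap the continuous encoding onto the user's perceptible steps."""
    d = min(max(distance_mm, 0.0), max_range_mm)
    closeness = 1.0 - d / max_range_mm
    return steps[round(closeness * (len(steps) - 1))]

# A sensitive user resolving 5 Hz differences gets 17 steps over 20-100 Hz;
# a less sensitive one resolving 20 Hz differences gets only 5.
fine = calibrated_steps(20.0, 100.0, 5.0)
coarse = calibrated_steps(20.0, 100.0, 20.0)
print(quantized_frequency(2.5, 10.0, fine))    # 80.0 Hz
print(quantized_frequency(2.5, 10.0, coarse))  # 80.0 Hz here, coarser overall
```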

The researchers found that users receiving electrical pulses were more aware of the proximity to underlying surfaces and could lower their force of contact by around 70%, performing much better than the other group. Overall, they observed that proximity information delivered through mild electric pulses was about three times more effective than the visual information alone.

Park said their novel approach has the potential to significantly increase maneuverability during surgery while minimizing risks of unintended tissue damage. He also said their technique would add little to the existing mental load of surgeons during operative procedures.

"Our goal was to come up with a solution that would improve the accuracy in proximity estimation without increasing the burden of active thinking needed for this task," he said. "When our technique is ready for use in surgical settings, physicians will be able to intuitively know how far their robotic fingers are from underlying structures, which means that they can keep their active focus on optimizing the surgical outcome of their patients."
Other contributors to the research include Ziqi Zhao, Minku Yeo and Stefan Manoharan from the Texas A&M Department of Electrical and Computer Engineering, and Seok Chang Ryu from Ewha Womans University, South Korea.

-end-

YouTube link: https://www.youtube.com/watch?v=5KRSRfeL_tE&feature=emb_logo

Texas A&M University
