Through the eyes of animals

December 03, 2019

Humans are now closer to seeing through the eyes of animals, thanks to an innovative software framework developed by researchers from the University of Queensland and the University of Exeter.

PhD candidate Cedric van den Berg from UQ's School of Biological Sciences said that, until now, it has been difficult to understand how animals really see the world.

"Most animals have completely different visual systems to humans, so - for many species - it is unclear how they see complex visual information or colour patterns in nature, and how this drives their behaviour," he said.

"The Quantitative Colour Pattern Analysis (QCPA) framework is a collection of innovative digital image processing techniques and analytical tools designed to solve this problem.

"Collectively, these tools greatly improve our ability to analyse complex visual information through the eyes of animals."

Dr Jolyon Troscianko, the study's co-leader from the University of Exeter, said colour patterns have been key to understanding many fundamental evolutionary problems, such as how animals signal to each other or hide from predators.

"We have known for many years that understanding animal vision and signalling depends on combining colour and pattern information, but the available techniques were near impossible to implement without some key advances we developed for this framework."

Because the framework works from digital photos, it can be used in almost any habitat - even underwater - with anything from off-the-shelf cameras to sophisticated full-spectrum imaging systems.

"You can even access most of its capabilities by using a cheap (~ $110 AUD, £60 GBP, $80 USD) smartphone to capture photos," Dr Troscianko said.

It took four years to develop and test the technology, including building an extensive interactive online platform that provides researchers, teachers and students with user guides, tutorials and worked examples of how to use the tools.

UQ's Dr Karen Cheney said the framework can be applied to a wide range of environmental conditions and visual systems.

"The flexibility of the framework allows researchers to investigate the colour patterns and natural surroundings of a wide range of organisms, such as insects, birds, fish and flowering plants," she said.

"For example, we can now truly understand the impacts of coral bleaching for camouflaged reef creatures in a new and informative way."

"We're helping people - wherever they are - to cross the boundaries between human and animal visual perception."

"It's really a platform that anyone can build on, so we're keen to see what future breakthroughs are ahead."
The study is published in Methods in Ecology and Evolution (DOI: 10.1111/2041-210X.13328).

University of Queensland
