100 percent of the image restored using a version containing between 1 and 10 percent of the information

October 24, 2013

In his PhD thesis, Daniel Paternain-Dallo, a computer engineer at the NUP/UPNA-Public University of Navarre, has developed algorithms to reduce and optimize images; using a reduced image (containing between 1% and 10% of the information in the original), they allow 100% of the pixels in the initial image to be restored. "With these algorithms we can obtain high-quality images that are very similar to the original. We have shown that even if we lose 100% of the pixels of the image, we can restore a lost image with a very high level of quality just by using the information from the reduced image." The PhD thesis is entitled Optimization of image reduction and restoration algorithms based on penalty functions and aggregation techniques.

Daniel Paternain's research falls within the field of digital image processing, a discipline that has grown tremendously over the last forty years. In fact, the high quality of current digital images is partly due to ever-increasing spatial resolution (a higher number of pixels); in other words, a much larger quantity of information can be used to represent the same scene.

As the researcher points out, the two main problems with high-resolution images are the cost of storing or transmitting them (over the Internet, for example) and the long time computers take to process them. To solve both problems at once, Daniel Paternain's thesis puts forward various algorithms to reduce both colour and greyscale images. "The aim," he explains, "is to reduce the number of pixels the image contains while keeping all, or as much as possible, of the information and properties of the original image."

The main idea underpinning the algorithms developed is to divide the image into small zones that are processed individually. "For each zone we look for a value that is simultaneously the least different from all the pixels that form the zone. By following this methodology, we can design algorithms that are very efficient in terms of execution time, and capable of being adapted to the local properties of each zone of the image."
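This zone-by-zone idea can be sketched in a few lines of NumPy. The following is a minimal illustration under simple assumptions, not the thesis's actual algorithms: each fixed-size zone is replaced by the single value that minimises the total absolute difference to its pixels, which is the zone's median.

```python
# A minimal sketch (not the thesis's exact method): reduce a greyscale
# image by splitting it into fixed-size zones and representing each
# zone by one value. The median minimises sum(|pixel - value|) over
# the zone, i.e. it is "the least different from all the pixels".
import numpy as np

def reduce_image(img: np.ndarray, block: int = 4) -> np.ndarray:
    """Shrink `img` by `block` in each dimension, one value per zone."""
    h, w = img.shape
    h -= h % block  # drop edge pixels that don't fill a complete zone
    w -= w % block
    zones = img[:h, :w].reshape(h // block, block, w // block, block)
    # Median over each block x block zone (axes 1 and 3).
    return np.median(zones, axis=(1, 3))

img = np.arange(64, dtype=float).reshape(8, 8)
small = reduce_image(img, block=4)
print(small.shape)  # (2, 2)
```

With 4x4 zones, the reduced image keeps one value per sixteen pixels, roughly 6% of the original, which sits inside the 1%-10% range the thesis reports.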

First, he developed an algorithm to reduce greyscale images. Aggregation functions are used to achieve this: "They are highly applicable because they study how to combine various homogeneous or heterogeneous sources of information into a single value that represents them." For colour images, in which each pixel carries a larger amount of information, he studied so-called penalty functions. "This mathematical tool enables us, by means of optimization algorithms, to automatically select the aggregation function best suited to each zone of the colour image."
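As a rough illustration of the penalty idea (the penalty functions in the thesis are more general than this), one can score several candidate aggregation functions on each zone and keep the one whose output incurs the lowest penalty. Here the penalty is simply the sum of absolute differences; this is an assumption for the example, not the thesis's definition.

```python
# Hedged sketch: automatic per-zone selection of an aggregation
# function by minimising a penalty. With an absolute-difference
# penalty the median tends to win; a squared-error penalty would
# instead favour the mean.
import numpy as np

AGGREGATIONS = {
    "mean": np.mean,
    "median": np.median,
    "min": np.min,
    "max": np.max,
}

def penalty(zone: np.ndarray, value: float) -> float:
    """Total absolute difference of representing `zone` by `value`."""
    return float(np.sum(np.abs(zone - value)))

def best_aggregation(zone: np.ndarray) -> tuple[str, float]:
    """Return the name and output of the lowest-penalty aggregation."""
    outputs = {name: float(f(zone)) for name, f in AGGREGATIONS.items()}
    name = min(outputs, key=lambda n: penalty(zone, outputs[n]))
    return name, outputs[name]

zone = np.array([10.0, 12.0, 11.0, 200.0])  # zone with one bright outlier
name, value = best_aggregation(zone)
print(name, value)
```

For this zone the median (11.5) is selected, since the outlier makes the mean, min, and max poor single-value summaries; swapping in a different penalty can change which aggregation is chosen for a given zone.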

Image restoration

The final step in his research explored how to apply the reduction algorithms to one of the most difficult problems in image processing: restoring digital images. "Let us assume that we lose a large quantity of pixels owing to a transmission error or a problem when processing the image," explains Paternain. "The restoration algorithm seeks to estimate the original value of the pixels we have lost and to obtain an image as similar as possible to the original."

To make the restoration possible, a highly reduced version of the original image, concentrating most of its properties, must be available in advance. The more information stored in the reduced image, the higher the quality of the restored image. "This reduced version cannot be very big, as we don't want to increase the cost of storing the image excessively. The reduced images we obtain through these algorithms account for between 1% and 10% of the original image." An optimization algorithm then estimates the value of the lost pixels using the information contained in both the damaged image and the reduced image.
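A much-simplified sketch of this kind of restoration (not the thesis's optimization algorithm) is: seed every lost pixel with the value its zone stores in the reduced image, then iteratively relax the lost pixels towards their neighbours while the surviving pixels stay fixed.

```python
# Simplified restoration sketch: lost pixels are first filled from the
# reduced image, then smoothed towards their 4-neighbour average.
# Known (surviving) pixels are never modified.
import numpy as np

def restore(damaged: np.ndarray, lost: np.ndarray,
            reduced: np.ndarray, block: int, iters: int = 50) -> np.ndarray:
    """Estimate lost pixels from the reduced image and the survivors.

    damaged : image whose entries are meaningless where `lost` is True
    lost    : boolean mask of missing pixels
    reduced : one stored value per block x block zone
    """
    out = damaged.astype(float).copy()
    # Step 1: seed every lost pixel with its zone's stored value.
    upsampled = np.kron(reduced, np.ones((block, block)))
    out[lost] = upsampled[lost]
    # Step 2: relax lost pixels towards their 4-neighbour average.
    for _ in range(iters):
        padded = np.pad(out, 1, mode="edge")
        avg = (padded[:-2, 1:-1] + padded[2:, 1:-1]
               + padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
        out[lost] = avg[lost]
    return out
```

Note that even if every pixel is lost, step 1 alone already yields the block-wise approximation stored in the reduced image, which is the intuition behind recovering a completely lost image from its reduced version.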

"We have shown that by using the algorithms proposed in this thesis, we can obtain images of high quality that are very similar to the original. We have shown that even if we lose 100% of the pixels of the image, we can, with a very high level of quality, restore an image that has been completely lost, just by using the information from the reduced image."

Daniel Paternain graduated in Computer Engineering from the NUP/UPNA-Public University of Navarre (2008), where he works as an assistant lecturer. He is the author of eight papers in international journals and three chapters in international books, and has made thirty conference contributions, twenty-six of them international. He is a reviewer for international journals of recognised prestige and assistant to the editor-in-chief of the journal Mathware & Soft Computing.

Elhuyar Fundazioa
