Disney Research algorithms improve animations featuring fog, smoke and underwater scenes

November 18, 2013

A team led by Disney Research, Zürich has developed a method for more efficiently rendering animated scenes that involve fog, smoke or other substances that affect the travel of light, significantly reducing the time needed to produce high-quality images and animations free of grain and noise.

The method, called joint importance sampling, helps identify the light paths through a foggy or underwater scene that are most likely to contribute to what the camera - and the viewer - ultimately sees. In this way, less time is wasted computing paths that contribute little to the final look of an animated sequence.

Wojciech Jarosz, a research scientist at Disney Research, Zürich, said that producing noise-free images of a complex scene can take minutes, hours or even days of computation. The new algorithms his team created can reduce that time dramatically, by a factor of 10, 100, or even 1,000 in their experiments.

"Faster renderings allow our artists to focus on the creative process instead of waiting on the computer to finish," Jarosz said. "This leaves more time for them to create beautiful imagery that helps create an engaging story."

The researchers, including collaborators from Saarland University, Aarhus University, Université de Montréal and Charles University, Prague, will present their findings at the ACM SIGGRAPH Asia 2013 conference, November 19-22, in Hong Kong.

Light rays are deflected or scattered not only when they bounce off a solid object, but also as they pass through aerosols and liquids. The effect of clear air is negligible for the rendering algorithms used to produce animated films, but realistically rendering fog, smoke, smog, rain, underwater environments, or even a glass of milk requires computational methods that account for these "participating media."
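For readers who want a concrete picture of what "accounting for participating media" involves, the standard starting point is the Beer-Lambert law, which gives the fraction of light that survives a given distance through a medium without being scattered or absorbed. The sketch below is purely illustrative; the extinction coefficients and distances are made-up values, not figures from the paper.

```python
import math

def transmittance(sigma_t, distance):
    """Beer-Lambert law: fraction of light that travels 'distance' through a
    homogeneous medium with extinction coefficient sigma_t without being
    scattered or absorbed along the way."""
    return math.exp(-sigma_t * distance)

# Illustrative (made-up) values:
print(transmittance(0.0001, 10.0))  # near-clear air over 10 m: ~0.999 survives
print(transmittance(0.5, 10.0))     # dense fog over 10 m: ~0.0067 survives
```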

So-called Monte Carlo algorithms are increasingly being used to render such phenomena in animated films and special effects. These methods operate by analyzing a random sampling of possible paths that light might take through a scene and then averaging the results to create the overall effect. But Jarosz explained that not all paths are created equal. Some paths end up being blocked by an object or surface in the scene; in other cases, a light source may simply be too far from the camera to have much chance of being seen. Calculating those paths can be a waste of computing time or, worse, averaging them may introduce error, or noise, that creates unwanted effects in the animation.
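To make the "random sampling and averaging" idea concrete, here is a toy sketch of how a renderer might estimate the light scattered toward the camera along a single ray through a uniform medium. It is not the estimator used in the paper; the light_at function and all constants are hypothetical.

```python
import math
import random

def estimate_inscatter(sigma_t, ray_length, light_at, num_samples):
    """Toy Monte Carlo estimate of the light scattered toward the camera along
    one ray through a homogeneous medium. Each sample picks a scatter distance
    uniformly at random; the weighted contributions are then averaged."""
    total = 0.0
    for _ in range(num_samples):
        t = random.uniform(0.0, ray_length)   # random scatter point on the ray
        pdf = 1.0 / ray_length                # density of that uniform choice
        # survival to t, times scattering, times (hypothetical) incoming light
        contribution = math.exp(-sigma_t * t) * sigma_t * light_at(t)
        total += contribution / pdf           # standard Monte Carlo weighting
    return total / num_samples                # average of all samples

# Few samples -> visibly noisy estimate; many samples -> converges, but slowly.
noisy  = estimate_inscatter(0.2, 10.0, lambda t: 1.0 / (1.0 + t * t), 16)
smooth = estimate_inscatter(0.2, 10.0, lambda t: 1.0 / (1.0 + t * t), 16384)
```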

Computer graphics researchers have tried various "importance sampling" techniques to increase the probability that the randomly chosen light paths will ultimately contribute to the final scene and to keep noise to a minimum. Some techniques trace the light from its source to the camera; others trace it from the camera back to the source. Still others are bidirectional, tracing paths from both the camera and the source before connecting them. Unfortunately, even these bidirectional techniques build the light and camera portions of each path independently, without knowledge of each other, so the completed paths are unlikely to contribute strongly to the final image.
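One common flavor of importance sampling, shown here only as a generic example (a textbook technique, not the method introduced in the paper), is to choose scatter distances along a ray in proportion to the transmittance, so samples cluster where light is most likely to survive.

```python
import math
import random

def sample_distance_uniform(ray_length):
    """Baseline: scatter distance chosen uniformly along the ray. Returns (t, pdf)."""
    t = random.uniform(0.0, ray_length)
    return t, 1.0 / ray_length

def sample_distance_by_transmittance(sigma_t, ray_length):
    """Importance sampling: scatter distance chosen with probability proportional
    to the transmittance exp(-sigma_t * t), truncated to the ray length, so
    samples land where light is most likely to survive. Returns (t, pdf)."""
    u = random.random()
    cdf_max = 1.0 - math.exp(-sigma_t * ray_length)  # normalization over [0, L]
    t = -math.log(1.0 - u * cdf_max) / sigma_t       # invert the truncated CDF
    pdf = sigma_t * math.exp(-sigma_t * t) / cdf_max
    return t, pdf
```

Both routines give unbiased estimates once each sample is weighted by 1/pdf; the importance-sampled version simply wastes fewer samples on distances where almost no light survives, which is what keeps the noise down.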

By contrast, the joint importance sampling method developed by the Disney Research team chooses the locations along the random paths with mutual knowledge of the camera and light source locations. This approach allows their method to create high-contribution paths more readily, increasing the efficiency of the rendering process.
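The release does not give the paper's sampling formulas, so the sketch below is not Disney's joint importance sampling; it shows a related, simpler idea, equi-angular sampling, purely to illustrate what using knowledge of a light's position can look like: the scatter distance along a camera ray is drawn in proportion to a point light's inverse-square falloff, so samples concentrate where that light contributes most.

```python
import math
import random

def sample_equiangular(delta, d, t_min, t_max):
    """Pick a scatter distance t on a camera ray with probability proportional to
    1 / (d^2 + (t - delta)^2), where 'delta' is the ray parameter of the point
    closest to a point light and 'd' (> 0) is that closest distance. Samples
    concentrate near the light, where its inverse-square falloff is strongest.
    Returns (t, pdf). Illustrative only; not the paper's joint method."""
    theta_a = math.atan((t_min - delta) / d)
    theta_b = math.atan((t_max - delta) / d)
    theta = theta_a + random.random() * (theta_b - theta_a)  # uniform in angle
    t = delta + d * math.tan(theta)
    pdf = d / ((theta_b - theta_a) * (d * d + (t - delta) ** 2))
    return t, pdf
```

Joint importance sampling goes further, choosing the locations along a path with both the camera and the light in mind at once, but the underlying principle is the same: spend samples where the contribution will be large.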

The researchers found that their algorithms significantly reduced noise and improved rendering performance. "There's always going to be noise, but with our method, we can reduce the noise much more quickly, which can translate into savings of time, computer processing and ultimately money," Jarosz said.
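The link between noise and rendering time follows from a standard property of Monte Carlo methods (a well-known result, not spelled out in the release): the error of an average over N random samples shrinks roughly as

\[
\text{error} \approx \frac{\sigma}{\sqrt{N}},
\]

where sigma squared is the variance of a single sample. A sampling scheme that lowers that variance by a factor of k therefore reaches the same noise level with k times fewer samples, which is how better sampling turns into the 10x to 1,000x equal-quality speedups cited above.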
-end-
For more information on Joint Path Importance Sampling for Rendering Low Order Anisotropic Scattering, please visit the project website at http://www.disneyresearch.com/project/joint-importance-sampling/.

About Disney Research

Disney Research is a network of research laboratories supporting The Walt Disney Company. Its purpose is to pursue scientific and technological innovation to advance the company's broad media and entertainment efforts. Disney Research is managed by an internal Disney Research Council co-chaired by Disney-Pixar's Ed Catmull and Walt Disney Imagineering's Bruce Vaughn, and including the Directors of the individual labs. It has facilities in Los Angeles, San Francisco, Pittsburgh, Boston and Zürich. Research topics include computer graphics, video processing, computer vision, robotics, radio and antennas, wireless communications, human-computer interaction, displays, data mining, machine learning and behavioral sciences.
