Analysis of complex geometric models made simple

June 29, 2020

PITTSBURGH--Researchers at Carnegie Mellon University have developed an efficient new way to analyze complex geometric models by borrowing a computational approach that made photorealistic animated films possible.

Rapid improvements in sensor technology have generated vast amounts of new geometric information, from scans of ancient architectural sites to the internal organs of humans. But analyzing that mountain of data, whether it's determining if a building is structurally sound or how oxygen flows through the lungs, has become a computational chokepoint.

"The data has become a monster," said Keenan Crane, assistant professor of computer science and robotics. "Suddenly, you have more data than you can possibly analyze -- or even care about."

Crane and Rohan Sawhney, a Ph.D. student in the Computer Science Department, are taming the monster by using so-called Monte Carlo methods to simulate how particles, heat and other things move through or within a complex shape. The process eliminates the need to painstakingly divide shapes into meshes -- collections of small geometric elements that can be computationally analyzed. The researchers will present their method at the SIGGRAPH 2020 Conference on Computer Graphics and Interactive Techniques, which will be held virtually in July.

"Building meshes is a minefield of possible errors," said Sawhney, the lead author. "If just one element is distorted, it can throw off the entire computation. Eliminating the need for meshes is pretty huge for a lot of industries."

Meshing was also a tough problem for filmmakers trying to create photorealistic animations in the 1990s. Not only was meshing laborious and slow, but the results didn't look natural. Their solution was to add randomness to the process by simulating light rays that could bounce around a scene. The result was beautifully realistic lighting, rather than flat-looking surfaces and blocky shadows.

Likewise, Crane and Sawhney have embraced randomness in geometric analysis. They aren't bouncing light rays through structures, but they are using Monte Carlo methods to imagine how particles, fluids or heat randomly interact and move through space. First developed in the 1940s and 1950s for the U.S. nuclear weapons program, Monte Carlo methods are a class of algorithms that use randomness in an ordered way to produce numerical results.

Crane and Sawhney's work revives a little-used "walk on spheres" algorithm that makes it possible to simulate a particle's long, random walk through a space without determining each twist and turn. Instead, they calculate the size of the largest empty space around the particle -- in the lung, for instance, that would be the width of a bronchial tube -- and make that the diameter of each sphere. The program can then just jump from one random point on each sphere to the next to simulate the random walk.
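The sphere-jumping idea described above can be sketched in a few lines. The following is a minimal, illustrative 2D example, not the researchers' code: it estimates the solution of Laplace's equation at a point inside the unit disk, where the distance to the boundary (the radius of the largest empty "sphere," a circle in 2D) is easy to compute. The function names and the unit-disk setup are assumptions made for the sketch.

```python
import math
import random


def walk_on_spheres(x, y, boundary_value, dist_to_boundary,
                    eps=1e-3, n_walks=5000):
    """Estimate the solution of Laplace's equation at (x, y) by averaging
    boundary values reached via random sphere-to-sphere jumps."""
    total = 0.0
    for _ in range(n_walks):
        px, py = x, y
        while True:
            r = dist_to_boundary(px, py)  # radius of largest empty sphere
            if r < eps:                   # close enough to the boundary: stop
                break
            theta = random.uniform(0.0, 2.0 * math.pi)
            px += r * math.cos(theta)     # jump to a uniform random point
            py += r * math.sin(theta)     # on that sphere
        total += boundary_value(px, py)
    return total / n_walks


# Unit disk with boundary data g(x, y) = x; the harmonic extension is u = x,
# so the estimate at (0.3, 0.2) should be close to 0.3.
random.seed(0)
dist = lambda x, y: 1.0 - math.hypot(x, y)
g = lambda x, y: x
estimate = walk_on_spheres(0.3, 0.2, g, dist)
```

Because each walk is independent, the running average can be reported after any number of walks, which is what makes the rough-preview-then-refine workflow described below possible.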

While it might take a day just to build a mesh of a geometric space, the CMU approach allows users to get a rough preview of the solution in just a few seconds. This preview can then be refined by taking more and more random walks.

"That means one doesn't have to sit around, waiting for the analysis to be completed to get the final answer," Sawhney said. "Instead, the analysis is incremental, providing engineers with immediate feedback. This translates into more time doing and less time banging one's head against the wall trying to understand why the analysis isn't working."

Sawhney and Crane are working with industry partners to expand the kinds of problems that can be solved with their methods. The National Science Foundation, Packard Fellowship, Sloan Foundation, Autodesk, Adobe, Disney and Facebook provided support for this work.

Carnegie Mellon University
