NASA Marshall scientist seeks improved methods for weather prediction in southeast U.S.

June 20, 2001

A new NASA-developed technique to improve numerical weather prediction - one that looks to the ground as well as the clouds - may one day help forecasters increase the accuracy of spring and summer weather predictions.

Atmospheric scientist Bill Lapenta, of the Global Hydrology and Climate Center, based at the National Space Science and Technology Center (NSSTC) in Huntsville, Ala., is researching a new method for improving numerical weather prediction in the Southeast United States. Funded through the U.S. Weather Research Program, the research is a cooperative effort between NASA and the National Oceanic and Atmospheric Administration (NOAA).

Numerical weather prediction is a complicated business: it combines data from many sources to form a prediction of tomorrow's weather. Like a chef creating a favorite dish, Lapenta's recipe, or equation, for weather prediction includes ingredients used by many, along with specialty items used by few.

In addition to standard data - such as current air temperature, humidity, and wind speed - he adds a dash of specialized data from the Geostationary Operational Environmental Satellites (GOES) maintained by NOAA.

Using the satellite data adds detailed ground-level information to the numerical forecasts - something Lapenta believes can help forecasters increase the accuracy of predictions. "Understanding weather is more than understanding what's happening high in the clouds," he said. "The satellite data takes into account conditions at ground level, where the weather impacts most people."

This method incorporates factors such as variations in the way different land surfaces react to the energy emitted by the sun.

"From prior NASA research, we know that parking lots, which absorb and hold heat, tend to become much hotter during the day than forests, which are cooled by evaporation," he said.

"Also, the amount of water in the top layers of the soil affects how the Sun's energy heats the overlying air. If the soil is wet, more energy is used to evaporate moisture than to heat the land and air. We adjust the initial estimate of moisture availability so that the predicted air temperature follows what the satellite senses. The satellite data helps the model to account for such differences in the temperature of the land surface."

Even though the weather-prediction equations are complex, the concept is quite straightforward. Lapenta's model uses an array of geographic grid points. Using these points, the method starts by creating a "snapshot" of the current state of the atmospheric winds, temperatures, and humidity. The next step is to use mathematical equations to predict the evolution of the atmosphere over the course of 48 hours.
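
The release gives none of the model's actual equations, but the grid-and-time-step idea can be illustrated with a short, purely hypothetical Python sketch. The grid size, variables, and "cooling trend" below are invented for illustration and are not taken from Lapenta's model.

    # Minimal, hypothetical sketch of the grid-and-time-step idea described above.
    # None of these numbers or equations come from the actual model; they only show
    # how a forecast starts from a gridded "snapshot" and is stepped forward in time.
    import numpy as np

    # A small array of geographic grid points holding the current "snapshot"
    # of the atmosphere: temperature (C), humidity (fraction), wind speed (m/s).
    ny, nx = 4, 5
    state = {
        "temperature": np.full((ny, nx), 25.0),
        "humidity":    np.full((ny, nx), 0.60),
        "wind":        np.full((ny, nx), 3.0),
    }

    def step_forward(state, hours):
        """Advance the toy 'atmosphere' by one time step.

        A real model would integrate the equations of fluid motion and
        thermodynamics here; this placeholder simply applies a mild cooling
        and moistening trend so the forecast loop has something to do.
        """
        new = {k: v.copy() for k, v in state.items()}
        new["temperature"] -= 0.05 * hours  # placeholder cooling trend
        new["humidity"] = np.clip(new["humidity"] + 0.002 * hours, 0.0, 1.0)
        return new

    # Step the snapshot forward in 3-hour increments out to 48 hours.
    dt_hours, forecast_length = 3, 48
    for _ in range(forecast_length // dt_hours):
        state = step_forward(state, dt_hours)

    print("48-hour mean temperature:", state["temperature"].mean())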

"Many details are factored into the weather-prediction equations," he said. "For example, today's rainfall may become tomorrow's humidity through evaporation from the wet soil."

When all the standard factors are calculated into his formulas, there is enough information for an initial forecast, but that's not where it ends. He then adds the satellite data, which adjusts the soil moisture availability at each grid point, a change that can have a dramatic impact on the original prediction.
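
The article does not describe Lapenta's actual adjustment scheme, but the idea can be sketched, purely as an assumption-laden illustration, in a few lines of Python: if the model's predicted surface temperature runs warmer than what the satellite senses at a grid point, that point's moisture availability is nudged upward (wetter soil means more evaporation and cooler air), and vice versa. The function name, gain factor, and sample values below are all invented for illustration.

    # Hypothetical sketch of the soil-moisture adjustment idea; the gain factor
    # and values are illustrative, not Lapenta's actual scheme.
    import numpy as np

    def nudge_moisture(moisture_availability, model_temp, satellite_temp, gain=0.02):
        """Return adjusted moisture availability (0..1) for each grid point.

        moisture_availability : initial estimate at each grid point
        model_temp            : model-predicted surface air temperature (C)
        satellite_temp        : satellite-derived surface temperature (C)
        gain                  : illustrative adjustment strength per degree of error
        """
        error = model_temp - satellite_temp  # positive -> model too warm
        adjusted = moisture_availability + gain * error
        return np.clip(adjusted, 0.0, 1.0)   # keep values physically plausible

    # Example: at the first grid point the model runs 5 degrees warmer than the
    # satellite observation, so its moisture availability is nudged upward; the
    # last point already agrees with the satellite and is left unchanged.
    moisture = np.array([0.10, 0.45, 0.70])
    model_t  = np.array([38.0, 30.0, 27.0])
    sat_t    = np.array([33.0, 29.5, 27.0])
    print(nudge_moisture(moisture, model_t, sat_t))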

Lapenta is concentrating on spring and summer weather because precipitation during the warm-weather seasons has traditionally been more difficult to predict.

In addition to improving the accuracy of short-range (0- to 48-hour) predictions of temperature, humidity, and precipitation, Lapenta hopes to see this new method implemented in other models, including those used by the National Weather Service. He also sees potential for using the method to improve urban and air-quality modeling.
-end-
This is a joint research project with Dick McNider of the University of Alabama in Huntsville. It is supported by Ron Suggs and Gary Jedlovec, NASA scientists in the Global Hydrology and Climate Center, who process the satellite data. All are located at the National Space Science and Technology Center.

A collaboration that enables scientists, engineers and educators to share research and facilities, the NSSTC is a partnership among NASA's Marshall Space Flight Center in Huntsville, Ala., Alabama universities, and federal agencies. Opened in 2000, it focuses on space science, materials science, biotechnology, Earth sciences, propulsion, information technology, optics and other areas that support NASA's mission.

NASA/Marshall Space Flight Center News Center
