Better weather predictions in an avalanche of data

October 23, 2002

COLLEGE STATION, October 23, 2002 - Sometimes getting too much of a good thing may create more problems than not getting enough - especially when it comes to the weather. Just ask Texas A&M University atmospheric scientist Fuqing Zhang, whose ensemble weather forecasting research is burdened with trillions of bytes of real-time data.

Zhang's quest, funded by a National Science Foundation grant of $295,500, is to find the best way to assimilate the most recent weather observation data for input into the latest computer forecasting models.

"Right now, we have good computer programs to help us forecast tomorrow's weather," Zhang said. "For example, the official U.S. weather forecast, issued by the National Centers for Environmental Prediction (NCEP), part of the National Oceanic and Atmospheric Administration (NOAA), is completely computer generated, untouched, as it were, by human hands.

"The problem is that we have overwhelming amounts of data to put into such models," he continued. "We receive data on wind, water and temperature from surface weather stations, weather balloons, national Doppler radar coverage and satellites at rates that vary from minutes to hours to days. All this data is hard to integrate for computer input because it varies according to the different spatial, geographic and temporal scales over which it was collected. In addition, many of the measurements are indirect indicators of physical conditions.

"So, we need to come up with better ways to digest all this data in order to have immediate impacts on our daily weather predictions."

Zhang and his team of collaborators from NOAA, the National Center for Atmospheric Research (NCAR) and the University of Washington (Seattle) hope to improve the forecasting computers' data digestion through innovative statistical techniques that permit ensemble-based data assimilation.

"Ensemble-based data assimilation focuses on better ways to incorporate the uncertainties surrounding both yesterday's forecast and today's observations," Zhang said. "We sample the ways in which the previous day's forecast deviated from what really happened, and we sample the wealth of data available to us from the present 12-hour period. Then we use statistics to get the best estimate of current initial conditions for the computer forecasting models, which predict tomorrow's weather.

"Even given the problems of data sampling and uncertainty, new generation numerical weather prediction via computer simulations significantly outperforms human forecasters," he continued. "Now, innovative data assimilation techniques will not only take full advantage of current weather observations to make better daily weather forecasts, they will also provide guidance in designing next-generation weather observation networks."
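The blending Zhang describes, sampling how the previous forecast deviated from reality and weighing that spread against the uncertainty of today's observations, can be illustrated with a toy stochastic ensemble Kalman filter update. This is a minimal sketch of the general technique, not Zhang's actual system; the ensemble size, state dimension, observation value and error variance below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 20 forecast ensemble members, 3 state variables
# (e.g. temperature, wind, pressure at one grid point).
n_members, n_state = 20, 3

# Ensemble of prior forecasts: "yesterday's forecast" plus its spread.
prior = rng.normal(loc=10.0, scale=2.0, size=(n_members, n_state))

# One new observation of the first state variable, with error variance r.
obs, r = 12.0, 1.0
H = np.array([1.0, 0.0, 0.0])  # observation operator: picks variable 0

# Sample statistics from the ensemble (the "deviations from what
# really happened" are estimated by the spread of the members).
mean = prior.mean(axis=0)
anomalies = prior - mean
Pb_H = anomalies.T @ (anomalies @ H) / (n_members - 1)  # cov(state, Hx)
HPb_H = H @ Pb_H                                        # var(Hx)

# Kalman gain: weighs forecast uncertainty against observation error.
K = Pb_H / (HPb_H + r)

# Update each member toward a perturbed copy of the observation
# (the stochastic, perturbed-observations form of the EnKF).
perturbed_obs = obs + rng.normal(0.0, np.sqrt(r), size=n_members)
analysis = prior + np.outer(perturbed_obs - prior @ H, K)

print(analysis.mean(axis=0))  # analysis mean, pulled toward the observation
```

The analysis ensemble then serves as the set of initial conditions for the next forecast cycle; the observed variable's mean moves toward the observation and its spread shrinks, while correlated variables are adjusted through the sample covariance.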
-end-


Texas A&M University
