Analyzing options for increasing affordability of flood insurance

December 11, 2015

WASHINGTON -- A new congressionally mandated report from the National Academies of Sciences, Engineering, and Medicine identifies an approach for the Federal Emergency Management Agency (FEMA) to evaluate policy options for making premiums through the National Flood Insurance Program (NFIP) more affordable for those who have limited ability to pay.

Microsimulation is a modeling approach that is well-suited to estimating premiums and future flood damage claims at the individual policyholder level, the report says. A microsimulation modeling approach would, for example, allow FEMA to compare the price of NFIP premiums that reflect true flood risk -- as called for in the Biggert-Waters Flood Insurance Reform Act of 2012 -- with measures of policyholders' ability to pay. The agency then could evaluate how different premium and mitigation assistance programs might be designed to make premiums affordable for cost-burdened households.
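The idea can be illustrated with a toy sketch. The following is a minimal, purely illustrative microsimulation in Python; all figures (income ranges, loss distributions, the 1.3 premium loading, and the 2 percent affordability threshold) are hypothetical assumptions for demonstration, not FEMA's actual rating model or the report's parameters.

```python
# Toy microsimulation sketch (illustrative only; every number here is a
# hypothetical assumption, not FEMA's rating model or the report's data).
# Each synthetic household carries a risk-based premium and an income; a
# household is flagged as cost-burdened when its premium exceeds a fixed
# share of income, and an assistance program caps premiums at that share.
import random

random.seed(42)

COST_BURDEN_SHARE = 0.02  # hypothetical affordability threshold: 2% of income

def simulate_households(n):
    """Generate n synthetic policyholders with income and a risk-based premium."""
    households = []
    for _ in range(n):
        income = random.uniform(20_000, 150_000)           # annual income ($)
        expected_annual_loss = random.uniform(100, 5_000)  # modeled flood damage ($)
        premium = expected_annual_loss * 1.3               # risk-based premium with loading
        households.append({"income": income, "premium": premium})
    return households

def cost_burdened(households, share=COST_BURDEN_SHARE):
    """Count households whose premium exceeds the given share of income."""
    return sum(1 for h in households if h["premium"] > share * h["income"])

def assistance_cost(households, share=COST_BURDEN_SHARE):
    """Total subsidy needed to cap every premium at the affordability share."""
    return sum(max(0.0, h["premium"] - share * h["income"]) for h in households)

hh = simulate_households(10_000)
print(f"Cost-burdened households: {cost_burdened(hh)} of {len(hh)}")
print(f"Assistance program cost: ${assistance_cost(hh):,.0f}")
```

Because the model operates on individual records, alternative assistance designs (a different affordability share, a means test, a mitigation subsidy that lowers expected losses) can be compared simply by rerunning the tally functions with different parameters.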

FEMA currently does not have the policy analysis capacity or necessary data to comprehensively analyze different options for making flood insurance more affordable, the report says. For example, the NFIP's database lacks first-floor elevation data for many properties, making it difficult to estimate those properties' risk of flood damage or the premiums they would face under a risk-based pricing structure. Moreover, the NFIP database does not contain data on policyholders' income, wealth, or housing costs. Because of this lack of data, FEMA cannot analyze the likely impact of federal assistance programs that consider such factors.

The report identifies some limited analyses FEMA can do now, and describes ways that the agency can build its modeling capacity and data resources to enable such analyses. The pace at which FEMA's modeling capacity grows will depend on the resources available, access to appropriate expertise, and the support of agency leadership, the report says.

The committee that wrote the report also included further findings on topics examined in its March 2015 report, which discussed how NFIP premiums are set and the changes called for by the Biggert-Waters Act and the subsequent Homeowner Flood Insurance Affordability Act (HFIAA) of 2014. Biggert-Waters eliminated grandfathering -- which had allowed policyholders to maintain their premium rates even if their property was subsequently mapped into a higher flood-risk zone -- but HFIAA reinstated it. The new report describes how this reinstatement will limit the ability of the NFIP to employ risk-based pricing over time if climate change, land development, and more accurate mapping place more properties in higher flood-risk zones. Other new findings focus on the setting of premiums outside the 100-year floodplain, how the NFIP calculates the cost burden that premiums place on policyholders, the effects of risk-based pricing on property values, and the linking of flood insurance premium assistance with mitigation efforts.

The study was sponsored by FEMA. The National Academies of Sciences, Engineering, and Medicine are private, nonprofit institutions that provide independent, objective analysis and advice to the nation to solve complex problems and inform public policy decisions related to science, technology, and medicine. The Academies operate under an 1863 congressional charter to the National Academy of Sciences, signed by President Lincoln. A committee roster follows.

Sara Frueh, Media Relations Officer
Grace Minus, Media Assistant
Office of News and Public Information
202-334-2138; e-mail
Twitter: @theNASEM
RSS feed:

Copies of Affordability of National Flood Insurance Program Premiums - Report 2 are available online. Reporters may obtain a copy from the Office of News and Public Information (contacts listed above).

Division on Earth and Life Studies
Water Science and Technology Board

Committee on Affordability of National Flood Insurance Premiums - Report 2

Leonard A. Shabman (chair)
Resident Scholar
Resources for the Future
Washington, D.C.

Sudipto Banerjee
Professor and Chair
Department of Biostatistics
Fielding School of Public Health
University of California
Los Angeles

John J. Boland
Professor Emeritus
Department of Geography and Environmental Engineering
Johns Hopkins University

Patrick L. Brockett
Gus S. Wortham Memorial Chair in Risk Management and Insurance
Department of Management Science and Information Systems
Red McCombs School of Business
University of Texas

Raymond J. Burby
Professor Emeritus
Department of City and Regional Planning
University of North Carolina
Chapel Hill

Scott Edelman
Senior Vice President
Greensboro, N.C.

Michael Hanemann*
Professor of the Graduate School
University of California
Berkeley; and
Wrigley Chair in Sustainability
Department of Economics
Arizona State University

Carolyn Kousky
Resources for the Future
Washington, D.C.

Howard C. Kunreuther
Professor of Decision Sciences and Business and Public Policy, and
Risk Management and Decision Processes Center
Wharton School of Business
University of Pennsylvania

Shirley Laska
Professor Emerita of Sociology, and
Founding Past Director
University of New Orleans
New Orleans

David R. Maidment
Hussein M. Alharthy Centennial Chair of Civil Engineering, and
Center for Research in Water Resources
University of Texas

David I. Maurstad
Director and Senior Vice President
OST Inc.
McLean, Va.

Allen L. Schirm
Director of Methods and Senior Fellow
Mathematica Policy Research
Washington, D.C.

Ed Dunne
Study Director

*Member, National Academy of Sciences

National Academies of Sciences, Engineering, and Medicine
