Scientists, Philosophers Discuss How To Evaluate Data, Error

August 03, 1998

BLACKSBURG, Va.--Ecologists and policy makers make recommendations and decisions with important, long-term social and environmental impacts based upon the results of research done in dynamic environments, where findings have to be interpreted in the face of many variables, probabilities, and unknowns. "And there are deep philosophical disagreements among ecologists and other scientists about how to quantify the uncertainty, and even about how to interpret the evidential meaning of the data we collect, to determine what the data are saying, such as about ecological impacts of various technologies," says Virginia Tech philosophy professor Deborah Mayo.

At the 83rd annual Ecological Society of America meeting at the Baltimore Convention Center Aug. 2-6, Mayo will discuss her proposed framework for learning from evidence in the face of error. Her presentation is Monday, Aug. 3, at 3:10 p.m. as part of the "Nature of Scientific Evidence" symposium (1-5 p.m., room 316).

The symposium will include representatives from applied fields, statistics, and philosophy of science who will discuss and debate their different positions about the most fruitful and most reliable methods to use in making scientific inferences in ecology. "We will debate such issues as how to interpret statistical tests, how to get objective inferences or whether to rely on subjective assessments by experts, what is the best way to quantify uncertainty, and how should uncertain assessments be reported to policy makers or the general public," Mayo explains.

"But the issue is not just how much risk of error should we allow in making policy decisions? The issue is also the deeper one: How do we even evaluate and quantify the risk of error using the data we are able to collect?" she says.

"That is why the ecologists are not just grappling with policy issues, such as how conservative to be when one's data analysis will affect the environment, but they are also being drawn into philosophical issues about scientific data and the nature of scientific inference and scientific knowledge," Mayo says. "There are competing methods for interpreting data and each may yield very different assessments of what the risks are in the first place. That is why this session of ours at the ESA is so fascinating and is truly interdisciplinary. Philosophers of science whose business it is to understand and explain scientific data and uncertain inference are being called upon to reflect together with statisticians, biostatisticians, ecologists, and others to explore the most fundamental issues about how to interpret, quantify, and base decisions upon data."

Mayo's own "error statistical philosophy of evidence" is based on learning from mistakes by using familiar statistical hypotheses to recognize common types of experimental mistakes, then developing rules and techniques for uncovering and avoiding errors. "I propose reinterpreting standard statistical tests as tools for obtaining experimental knowledge," she says.

"I represent an orthodox approach with a lot of reinterpretation aimed deliberately at avoiding common misuses and criticisms." But the value of the symposium for her goes beyond promoting her own philosophy, she adds. "My goal is to learn much more about the issues from the scientists at the conference."

For more information, contact symposium organizer Nicholas Lewin, GIS Support and Research Facility, Iowa State University, 515-294-2279.

Virginia Tech
