More reliably predicting what will work

March 31, 2017

The translation of preclinical research findings into effective treatments continues to deliver unsatisfactory results: when experimental diagnostic and treatment approaches are applied in practice, many of them fail. What are the reasons behind this? A recent study by researchers from Charité - Universitätsmedizin Berlin and the Berlin Institute of Health (BIH) has shown that a more flexible approach to study design can significantly improve the efficiency of preclinical research. The results have been published in the current issue of the journal PLOS Biology.

The development of new treatment approaches demands reliable and reproducible results from biomedical research. These results must have high predictive ability in a real-world setting, both in terms of our understanding of the disease itself and our understanding of new diagnostic and treatment methods. However, many treatments that appear promising in animal studies later fail to produce results in the clinical setting. Reasons for this include a lack of quality control in preclinical studies, which may result in inadequate sample sizes, a lack of randomization, or the use of an inappropriate study design.

Working under the leadership of Prof. Dr. Ulrich Dirnagl, Head of Experimental Neurology at Charité and Founding Director of the Center for Transforming Biomedical Research at the BIH, the researchers showed that a more flexible approach based on group sequential designs can enhance the efficiency of preclinical studies. While such designs are common in clinical research, they remain rare in preclinical work. Group sequential methods allow studies to be planned with larger maximum sample sizes, yielding more robust findings. Stopping criteria, defined in advance, permit a study to be terminated early if the treatment fails to produce the expected effect, or if the effect size turns out to be very large. As a result, many studies initially planned with large numbers of animals can be stopped early, so that fewer animals are actually needed.
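The interim-analysis logic described above can be sketched in a few lines. Everything below is a toy illustration, not the study's actual procedure: the boundary is an approximate Pocock critical value for three looks at two-sided alpha = 0.05, and the look schedule, futility threshold, and `group_sequential_trial` function are hypothetical choices for this sketch.

```python
import math
import random

# Approximate Pocock efficacy boundary for 3 interim looks,
# two-sided alpha = 0.05 (an illustrative assumption).
POCOCK_Z = 2.289

def z_statistic(treat, ctrl):
    """Two-sample z-statistic (large-sample approximation)."""
    n1, n2 = len(treat), len(ctrl)
    m1, m2 = sum(treat) / n1, sum(ctrl) / n2
    v1 = sum((x - m1) ** 2 for x in treat) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in ctrl) / (n2 - 1)
    se = math.sqrt(v1 / n1 + v2 / n2)
    return (m1 - m2) / se if se > 0 else 0.0

def group_sequential_trial(effect, n_per_look=10, looks=3,
                           efficacy_z=POCOCK_Z, futility_z=0.3):
    """Simulate one two-arm experiment with pre-planned interim looks.

    At each look, a z-statistic is computed on all accumulated data.
    The trial stops early if |z| crosses the efficacy boundary, or
    (before the final look) if |z| is still so small that an effect
    looks unlikely. Returns (decision, animals_used).
    """
    treat, ctrl = [], []
    for look in range(1, looks + 1):
        treat += [random.gauss(effect, 1.0) for _ in range(n_per_look)]
        ctrl += [random.gauss(0.0, 1.0) for _ in range(n_per_look)]
        z = z_statistic(treat, ctrl)
        if abs(z) >= efficacy_z:
            return "stop: effect detected", len(treat) + len(ctrl)
        if look < looks and abs(z) < futility_z:
            return "stop: futility", len(treat) + len(ctrl)
    return "full sample: no early stop", len(treat) + len(ctrl)

random.seed(42)
decision, n = group_sequential_trial(effect=1.0)
print(decision, n)
```

Because the boundaries are fixed before any data are seen, stopping early does not inflate the false-positive rate in the way unplanned "peeking" at results would.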

"Our computer models show that, without affecting the validity of the study, group sequential designs lead to resource savings of 30% when compared to the block designs commonly used in preclinical studies," explains Dr. Ulrike Grittner, one of the study's two first authors.

Conventional block designs require the sample size to be fixed in advance, and the null hypothesis can only be tested once, at the end of the study. Dr. Grittner adds: "Higher standards of quality in preclinical research make it easier to translate research findings into clinical research. This means that promising new treatments can be spotted sooner, and can be made available to patients more quickly."
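The contrast between the two designs can be made concrete with a small Monte Carlo sketch. The look schedule, boundary, and effect size below are illustrative assumptions; the resulting saving depends entirely on the scenario and is not meant to reproduce the paper's 30% figure.

```python
import math
import random

LOOKS, N_PER_LOOK = 3, 8   # interim analyses; animals per arm added at each look
BOUNDARY = 2.289           # approximate Pocock boundary, 3 looks, alpha = 0.05

def animals_used(effect):
    """Animals consumed by one group-sequential trial (both arms).

    Outcomes are unit-variance Gaussians; the z-statistic is
    recomputed on the accumulated data at each pre-planned look.
    """
    treat_sum = ctrl_sum = 0.0
    n = 0
    for _ in range(LOOKS):
        treat_sum += sum(random.gauss(effect, 1.0) for _ in range(N_PER_LOOK))
        ctrl_sum += sum(random.gauss(0.0, 1.0) for _ in range(N_PER_LOOK))
        n += N_PER_LOOK
        z = (treat_sum / n - ctrl_sum / n) / math.sqrt(2.0 / n)
        if abs(z) >= BOUNDARY:   # early stop: clear effect (or harm)
            return 2 * n
    return 2 * n                 # ran to the planned maximum

random.seed(1)
fixed = 2 * LOOKS * N_PER_LOOK   # a block design always uses the full sample
trials = 2000
mean_seq = sum(animals_used(effect=0.8) for _ in range(trials)) / trials
saving = 1 - mean_seq / fixed
print(f"average saving vs fixed design: {saving:.0%}")
```

With a genuine moderate effect, many simulated trials cross the boundary at the first or second look, so the sequential design uses fewer animals on average than the fixed design while testing against the same pre-specified criteria.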

Charité - Universitätsmedizin Berlin
