Lower-quality studies often used to support changes to high-risk medical devices
Clinical studies supporting FDA approval of high-risk medical device modifications were found to be methodologically poor, with many lacking randomization and blinding.
Articles tagged with Data Points
Researchers analyzed ancient Roman skulls using state-of-the-art forensic techniques to identify significant cranial differences between coastal communities. The findings suggest that the area around Velia had a large Greek population, influencing local physical characteristics.
Researchers from TUM develop a new algorithm to reconstruct 4D point clouds of cities from satellite images, enabling early detection of dangers like subsidence and collapse. The method achieves precision to within a fraction of the radar wavelength, allowing urban growth and development to be monitored.
The art market price index, compiled by Prof. Roman Kräussl, shows a significant decline in post-war and contemporary art prices, with a 21% drop in 2016. This confirms a burst of the market bubble, which had been growing since 2009.
Researchers use machine-compiled data to construct a microbial supertree, addressing the challenge of accessing and synthesizing scientific literature. The study demonstrates the potential of high-throughput machine extraction in tackling the vast amount of published science.
Researchers found that nighttime illumination can accurately predict wealth levels in regions, including individual settlements. Satellite data on night light emissions was used to make inferences about local economic rank and compare wealth levels across countries.
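The analysis pattern here lends itself to a sketch. The snippet below is illustrative only, not the authors' pipeline: it assumes per-settlement mean radiance and a survey-derived wealth index as inputs, and fits a simple log-linear model.

```python
import numpy as np

def fit_nightlight_wealth_model(radiance, wealth_index):
    """Fit wealth ~ a + b * log(1 + radiance) by least squares.

    radiance: per-settlement mean night-light radiance (assumed input)
    wealth_index: survey-derived wealth score for the same settlements
    """
    x = np.log1p(radiance)                     # log(1 + r) keeps unlit cells finite
    b, a = np.polyfit(x, wealth_index, deg=1)  # slope and intercept
    predicted = a + b * x
    r2 = 1 - np.sum((wealth_index - predicted) ** 2) / np.sum(
        (wealth_index - wealth_index.mean()) ** 2)
    return a, b, r2

# Toy usage with synthetic data
rng = np.random.default_rng(0)
radiance = rng.gamma(2.0, 5.0, size=200)
wealth = 0.8 * np.log1p(radiance) + rng.normal(0, 0.3, size=200)
print(fit_nightlight_wealth_model(radiance, wealth))
```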
Researchers at Stanford University have created a deep learning algorithm that can accurately predict the toxicity of chemicals and associate drugs with side effects using just six data points. This breakthrough could help chemists choose promising candidates and accelerate drug development.
Researchers from MIT's CSAIL develop a new system called Flowtune that allocates network bandwidth more fairly in data centers, reducing lag and improving page load speeds. In tests, Flowtune completed the slowest 1% of data requests nine to eleven times as fast as existing systems.
A new study reveals that the level of lumbar spine fusion significantly influences the risk of developing proximal junctional kyphosis (PJK). Fusing lower portions of the lumbar spine decreases PJK risk, while more extensive fusions may increase it.
A new study found that 20% of homeowners who could have benefited from refinancing didn't, forgoing thousands of dollars in savings over the life of the loan. The researchers suggest that behaviors like inattention and procrastination are preventing people from taking advantage of long-term benefits.
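A worked example shows the scale of money at stake. The balance and rates below are hypothetical, not figures from the study; the formula is the standard fixed-rate amortization payment.

```python
def monthly_payment(principal, annual_rate, years):
    """Fixed-rate amortization: M = P*r / (1 - (1 + r)**-n)."""
    r = annual_rate / 12   # monthly interest rate
    n = years * 12         # number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

# Hypothetical refinance: $200,000 balance, 30-year term, 6.5% -> 4.5%
old = monthly_payment(200_000, 0.065, 30)
new = monthly_payment(200_000, 0.045, 30)
print(f"monthly savings:  ${old - new:,.0f}")          # roughly $250
print(f"lifetime savings: ${(old - new) * 360:,.0f}")  # roughly $90,000
```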
Experts warn that routine screening may lead to overdiagnosis and inappropriate treatment. The US Preventive Services Task Force's liberal recommendations have sparked criticism, with some experts questioning the validity of the research methodologies used.
Researchers have developed Vizier, an open-source software to 'clean' big data, making it easier for users to explore, visualize, and communicate insights. The tool is designed for massive datasets, allowing users to spot errors and offer solutions.
A team of geographers has developed a methodology to measure aboveground woody biomass in savannahs, with results showing significant changes in the ecosystem. The study found that some areas of Kruger National Park's savannah are experiencing declining biomass due to elephant activity, not human impact.
A review in Angewandte Chemie emphasizes the importance of standardized analytic protocols to accurately assess microplastic contamination in aquatic ecosystems. The authors propose a list of nine arguments for harmonized methods, including improved sampling and processing techniques, to ensure reliable data and inform risk assessments.
Researchers developed an algorithm that extracts diverse subsets from large data sets, improving machine learning efficiency and diversity. The new method is 1 billion times faster than existing algorithms, enabling real-time analysis of vast amounts of data.
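The article does not spell out the algorithm itself. For orientation, a common baseline for diverse-subset selection is greedy farthest-point sampling, sketched below; this is a generic textbook technique, not the paper's billion-times-faster method.

```python
import numpy as np

def farthest_point_subset(points, k):
    """Greedily pick k indices whose points are maximally spread out."""
    chosen = [0]                                  # seed with an arbitrary point
    dists = np.linalg.norm(points - points[0], axis=1)
    for _ in range(k - 1):
        idx = int(np.argmax(dists))               # farthest from current subset
        chosen.append(idx)
        dists = np.minimum(dists, np.linalg.norm(points - points[idx], axis=1))
    return chosen

pts = np.random.default_rng(1).normal(size=(10_000, 8))
print(farthest_point_subset(pts, 5))
```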
Researchers from MIT develop a new coreset-generation technique to shrink large data sets, preserving fundamental mathematical relationships. The technique can reduce massive matrices to smaller sizes, enabling faster computations and improved analysis.
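One standard construction in this area samples rows with probability proportional to their squared norms and reweights them so the smaller matrix approximately preserves the original's Gram matrix. The sketch below shows that generic construction, not MIT's specific coreset technique.

```python
import numpy as np

def sample_rows_sketch(A, m, rng=None):
    """Sample m rows of A proportionally to squared row norm, rescaled so
    that the sketch S satisfies E[S.T @ S] = A.T @ A."""
    rng = rng or np.random.default_rng()
    norms = np.einsum('ij,ij->i', A, A)   # squared norm of each row
    p = norms / norms.sum()
    idx = rng.choice(len(A), size=m, replace=True, p=p)
    scale = 1.0 / np.sqrt(m * p[idx])     # importance-sampling reweighting
    return A[idx] * scale[:, None]

A = np.random.default_rng(2).normal(size=(100_000, 20))
S = sample_rows_sketch(A, 2_000)
err = np.linalg.norm(S.T @ S - A.T @ A) / np.linalg.norm(A.T @ A)
print(f"relative Gram-matrix error: {err:.3f}")   # small despite 50x fewer rows
```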
A new report by University of Illinois professors recommends standardizing sharing of data sets and software code to improve reproducibility in computational research. The recommendations include documenting digital objects, making datasets and workflows accessible, and implementing reproducibility checks before publication.
A team from North Carolina State University has created an efficient method for analyzing data from wearables to track physical activity. By adjusting the time increment used to assess activity, they were able to accurately identify five activities (golfing, biking, walking, waving, and sitting) while reducing power consumption.
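The adjustable knob is the window length over which sensor samples are summarized: longer windows mean fewer classifications per hour and lower power draw, at the cost of coarser time resolution. The sketch below illustrates that trade-off under assumed inputs (a 1-D acceleration-magnitude stream at a fixed sampling rate); it is not the NC State pipeline.

```python
import numpy as np

def window_features(accel, fs, window_s):
    """Split an acceleration-magnitude stream into windows of window_s seconds
    and compute simple per-window features (mean, standard deviation)."""
    w = int(fs * window_s)                 # samples per window
    n = len(accel) // w                    # number of complete windows
    windows = accel[: n * w].reshape(n, w)
    return np.column_stack([windows.mean(axis=1), windows.std(axis=1)])

signal = np.abs(np.random.default_rng(3).normal(size=30 * 50))  # 30 s at 50 Hz
print(window_features(signal, fs=50, window_s=5).shape)  # 6 windows x 2 features
```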
Researchers have developed a new algorithm that can efficiently fit probability distributions to high-dimensional data, even when the dataset contains corrupted entries. The algorithm relies on two insights: selecting an appropriate metric for measuring distance from distributions and identifying regions where cross-sections should begin.
Researchers at MIT develop algorithms that automate key steps in big-data analysis, reducing time from months to days. The new approaches enable faster data preparation and problem specification, allowing data scientists to produce value quickly.
A study by University of Michigan professor Daniel Romero found that matching an opponent's linguistic style in presidential debates can lead to a bump in polls. Function words like conjunctions and quantifiers play a crucial role in this phenomenon, which is linked to processing fluency and easier understanding for third-party viewers.
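Style matching on function words is commonly quantified with the Language Style Matching (LSM) score, which compares usage rates category by category. The sketch below applies that published formula with tiny stand-in word lists; it is an illustration, not necessarily the exact metric from Romero's study.

```python
FUNCTION_WORDS = {
    "conjunctions": {"and", "but", "or", "so"},
    "quantifiers": {"all", "some", "many", "few"},
}  # tiny stand-ins; real analyses use full LIWC-style category lists

def usage_rate(text, words):
    tokens = text.lower().split()
    return sum(t in words for t in tokens) / max(len(tokens), 1)

def lsm_score(text_a, text_b):
    """Per-category 1 - |p1 - p2| / (p1 + p2), averaged over categories."""
    scores = []
    for words in FUNCTION_WORDS.values():
        p1, p2 = usage_rate(text_a, words), usage_rate(text_b, words)
        scores.append(1 - abs(p1 - p2) / (p1 + p2 + 1e-4))  # 1e-4 avoids 0/0
    return sum(scores) / len(scores)

print(lsm_score("and so we will act", "but some say all is lost"))
```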
Researchers at Ohio State University developed a mathematical model of drinking behavior in college students, revealing that they adjust their consumption to maintain a certain level of drunkenness. The study aims to provide insights into high-risk alcohol consumption and inform interventions.
The University of Arizona's INSITE Center helps Fortaleza city planners use big data to understand bus delays and overcrowding. The team developed an online dashboard with real-time data on bus movements, passenger numbers, and weather conditions to inform decision-making.
Scientists found that sections of the West Napa Fault continued to slip after the mainshock, posing additional hazards to infrastructure. This afterslip shifted some areas by as much as 40 centimeters in the following month.
Researchers discovered 'ecosystem canaries' can provide early warning signs of large, potentially catastrophic, changes in ecosystems. These species often vanish first from a stressed ecosystem, serving as an indicator of an approaching tipping point.
Researchers used data on taxi trip records and points of interest from FourSquare to gather insights into neighborhood crime rates. The study found that areas with nightclubs tend to be low-crime areas, suggesting people choose safe destinations.
A Big Data study has identified decreased blood flow in the brain as the earliest physiological sign of Alzheimer's disease. The researchers analyzed over 7,700 brain images and found that changes in cognition begin earlier than previously believed.
A new study by Yale University researchers found that high tobacco surcharges under the Affordable Care Act reduced smokers' insurance enrollment rates, contrary to the ACA's goal of universal coverage. The study also showed no increase in smoking cessation among smokers facing these surcharges.
A new global research project, Your DNA Your Say, aims to gather opinions from across the world on sharing genetic information. The survey, launched by the Global Alliance for Genomics and Health (GA4GH), seeks to understand public attitudes on data sharing, its use in genomic medicine, and potential harms.
A new study by Dr Natacha Postel-Vinay shows that banks in 1920s Chicago, a city that went through a severe real estate boom and bust, were more likely to fail if they had invested heavily in mortgages. The research suggests that today's banks should be cautious not to over-invest in real estate loans.
Researchers have developed a method called 'LIGR-Seq' to explore the functions of non-coding RNAs in human cells. The study revealed new roles for small nucleolar RNAs in regulating protein-coding mRNA stability and abundance.
A recent Iowa State University study found no evidence that grit improves performance, contradicting popular views on the trait. The research analyzed over 67,000 people and concluded that grit is not a unique predictor of success.
A study from Johns Hopkins Medicine found that common measures of hospital safety do not accurately capture quality care, with only one out of 21 measures meeting scientific criteria. The researchers argue that patients deserve valid measures of quality and safety, and call for public rating systems to use clinical rather than billing data.
Researchers at NC State developed a model that expedites design of MIMO antennas, reducing calculation time from days to minutes. The model identifies efficient configurations for antenna designs, saving designers an enormous amount of time.
Mount Sinai researchers found that the testing service and the time of sample collection significantly influenced laboratory test results, with some providers flagging more abnormal readings than others. The study highlights the importance of accurate lab test results in medical decision-making.
Researchers at MIT and Harvard University have developed a new cryptographic system called Sieve that allows users to store and manage their personal data securely. With Sieve, users can control which apps have access to their data and revoke access with ease.
The Wrangler Supercomputer utilizes 600 terabytes of flash memory to process massive datasets, enabling scientists to analyze thousands of files quickly. This allows researchers to explore new questions and drive previously unattainable discoveries in fields such as gene analysis and building energy efficiency.
Researchers are being called upon to share data from a landmark trial that raised concerns about the use of starch solutions in critically ill patients. The CHEST trial, published in 2012, led to regulators suspending use of hydroxyethyl starch (HES) products worldwide.
A Penn study found that using machine learning forecasts at arraignments can dramatically reduce subsequent domestic violence arrests. Machine learning analyzes data points like age, gender, and prior warrants to determine risk, offering extra information to courts.
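As a rough illustration of this kind of forecast, the sketch below trains a tree-ensemble classifier on synthetic placeholder features of the sort the article names (age, gender, prior warrants). It is a generic sketch, not the Penn study's model or data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic placeholder data: columns are [age, gender_code, prior_warrants]
rng = np.random.default_rng(4)
X = np.column_stack([
    rng.integers(18, 70, 1_000),   # age
    rng.integers(0, 2, 1_000),     # gender (coded)
    rng.poisson(1.0, 1_000),       # count of prior warrants
])
y = rng.integers(0, 2, 1_000)      # 1 = subsequent domestic-violence arrest

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
# A court would see a risk probability rather than a bare yes/no label:
print(model.predict_proba([[35, 1, 3]])[0, 1])
```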
The Autumn Symposium 2015 presented diverse perspectives on the use of real-world data in benefit assessments, highlighting its limitations and potential drawbacks. The consensus among speakers was that non-RCT data, often marketed as 'real-world data', is no substitute for rigorous clinical trials.
Researchers at UTA are working with the North Central Texas Council of Governments to determine the optimal toll price for managed lanes. They will use survey data and video traffic-volume data to recommend toll charges that encourage efficient use of the lanes without increasing congestion, while maintaining speeds of at least 50 mph.
A 15-year study found that high summer air temperatures have a significant impact on Eastern brook trout populations, particularly on the smallest fry and eggs. The researchers predict that if climate warming proceeds as projected, these fish could become extinct in 15 years unless they evolve to adapt.
MIT researchers developed a new algorithm that reduces computation time for complex models by 200 times, using probability distributions and relevant data. The 'shrinking bull's-eye' algorithm can apply to various fields, including engineering, geophysics, and subsurface modeling.
Two new risk prediction tools can identify patients with diabetes who are at high risk of serious complications, such as blindness and amputation. The tools use variables that patients are likely to know or that are routinely recorded in general practice computer systems.
Researchers developed NOMAD, a novel way to analyze bigger datasets using supercomputers and machine learning algorithms, achieving superlinear speedup on large-scale data. This breakthrough enables efficient processing of massive networks, predicting user preferences and identifying key topics in billions of documents.
Experts believe video recording can increase accountability and improve quality through increased transparency. This approach has successfully reduced speeding vehicles and improved hand washing rates in hospitals.
Data from New Horizons' flyby of Pluto suggest the dwarf planet has been frequently resurfaced by erosion or crustal recycling. The study also reveals large regions of differing brightness on Pluto's surface, carved out by structures similar to terrestrial glaciers.
Experts' advice is often compromised by subjective influences and biases, warn researchers. They suggest integrating expert judgements with evidence-based approaches to improve accuracy and reliability.
The Pan-European Species-directories Infrastructure (PESI) project provides a harmonized taxonomic reference system and high-quality data sets to efficiently handle big taxonomic data. It is built on five pillars (knowledge, consensus, standards, data, and dissemination) to standardize taxonomic information across platforms.
A new study reveals how DNA barcoding can accelerate biodiversity inventories, combining rapid publishing techniques with surveys of nature reserves completed in just four months. The surveys revealed abundant biodiversity in previously understudied taxa, adding 181 new spider species to the inventory list.
A new theoretical analysis by MIT researchers demonstrates how their compression techniques can expand applications of accelerated searching in biology and other fields. The algorithms cluster similar genomic sequences, then choose one representative sequence to focus on, significantly reducing the search time.
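A minimal greedy version of that cluster-and-represent idea follows; the position-wise similarity measure and the 0.8 threshold are simplifications for illustration, not the MIT algorithms.

```python
def similarity(a, b):
    """Fraction of matching positions; a crude stand-in for real alignment."""
    n = min(len(a), len(b))
    return sum(x == y for x, y in zip(a, b)) / n if n else 0.0

def greedy_cluster(seqs, threshold=0.8):
    """Assign each sequence to the first representative within `threshold`
    similarity; otherwise it starts a new cluster. A search then scans only
    the representatives instead of every sequence."""
    reps = []
    for s in seqs:
        if not any(similarity(s, r) >= threshold for r in reps):
            reps.append(s)
    return reps

seqs = ["ACGTACGT", "ACGTACGA", "TTTTCCCC", "TTTTCCCG"]
print(greedy_cluster(seqs))   # two representatives stand in for four sequences
```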
The NASA satellite camera has returned its first view of the entire sunlit side of Earth, generating color images using a combination of narrowband filters. The images show effects of sunlight scattered by air molecules, giving them a characteristic bluish tint.
A 30-year study of 4,941 thyroid cancer patients found that moderate TSH suppression can bring benefits comparable to more extreme approaches, with potentially fewer side effects. The research supports a shift towards more tailored hormone suppression strategies for high-risk patients.
A recent study published in Conservation Biology makes a strong case for using detailed wildlife density data to inform conservation planning. The research found that prioritizing areas with higher species richness led to the protection of more individuals of each species.
Researchers combined local bird survey data from 16 point counts in Northeast and Midwest forests, revealing significant declines in populations of four ecological indicator species. The study highlights the need for more formal monitoring in rare habitats and suggests that some species may be increasing in certain regions.
Despite accumulating significant genetic mutations, Ebola has remained functionally unchanged over the past four decades. This suggests that vaccines and treatments developed during current outbreaks may remain effective against future ones.
Using satellite data to map ground deformation and measure creep along the southern end of the Hayward Fault, the team found that the two faults are connected, allowing a rupture to propagate across them and produce a significantly more destructive earthquake.
Experts analyze data from a national survey to question the feasibility of party promises on accessing general practice. The analysis highlights that guarantees of appointments within 48 hours may be unrealistic, and recruiting 5,000 more GPs could prove challenging.
Researchers have developed a new method to store large volumes of data using DNA and silica, which can potentially survive for over a million years. The technique uses an algorithm to correct errors and encases the information-bearing segments of DNA in silica, providing a robust storage solution.
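The published error-correction layer is too involved to reproduce here, but every DNA storage scheme starts from a mapping between bits and bases. A minimal round-trip sketch of that first step, assuming the common two-bits-per-base convention:

```python
BASE_FOR_BITS = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
BITS_FOR_BASE = {v: k for k, v in BASE_FOR_BITS.items()}

def bytes_to_dna(data: bytes) -> str:
    """Map each byte to four bases, two bits per base (A=00, C=01, G=10, T=11)."""
    return "".join(
        BASE_FOR_BITS[(byte >> shift) & 0b11]
        for byte in data
        for shift in (6, 4, 2, 0)
    )

def dna_to_bytes(seq: str) -> bytes:
    out = bytearray()
    for i in range(0, len(seq), 4):
        byte = 0
        for base in seq[i : i + 4]:
            byte = (byte << 2) | BITS_FOR_BASE[base]
        out.append(byte)
    return bytes(out)

encoded = bytes_to_dna(b"hi")
print(encoded, dna_to_bytes(encoded))   # round-trips back to b'hi'
```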
MIT researchers report that just four pieces of information are enough to identify 90% of people in a data set; adding coarse-grained price information raises that figure to 94%. The study highlights the risks of re-identification and encourages socially beneficial uses of big data.
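The underlying measure is often called unicity: the fraction of users uniquely pinned down by k points drawn at random from their own record. A minimal sketch, with synthetic (place, day) records standing in for the study's data:

```python
import random

def unicity(records, k, trials=1_000, rng=None):
    """Estimate the fraction of users uniquely identified by k random points
    from their own record (each record is a set of hashable points)."""
    rng = rng or random.Random(0)
    users = list(records)
    hits = 0
    for _ in range(trials):
        user = rng.choice(users)
        points = set(rng.sample(sorted(records[user]), k))
        matches = sum(points <= records[u] for u in users)
        hits += matches == 1   # only the true user's record holds all k points
    return hits / trials

rng = random.Random(1)
records = {u: {(rng.randrange(50), rng.randrange(30)) for _ in range(12)}
           for u in range(500)}
print(unicity(records, k=4))   # close to 1.0: four points almost always suffice
```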