Researchers developed a multi-fidelity graph network approach to predict material properties with improved accuracy, enabling predictions for disordered materials. The new method reduced mean absolute errors by 22-45% compared to traditional approaches.
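One way to picture a multi-fidelity model is that every training sample carries a tag saying which level of theory produced its label, so scarce high-accuracy data can borrow strength from abundant cheap data. The sketch below illustrates only that idea; the descriptors, fidelity levels, and regressor are assumptions, not the published graph-network approach.

```python
# Minimal multi-fidelity sketch: append a fidelity tag to each sample's features
# so one regressor learns from mixed low- and high-fidelity data.
# Everything here is illustrative; it is not the published graph-network model.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
n_low, n_high, n_feat = 2000, 200, 16            # many cheap labels, few accurate ones
X = rng.normal(size=(n_low + n_high, n_feat))    # stand-in material descriptors
fidelity = np.r_[np.zeros(n_low), np.ones(n_high)]   # 0 = low fidelity, 1 = high fidelity
y = X[:, 0] + 0.1 * fidelity * X[:, 1] + rng.normal(scale=0.05, size=n_low + n_high)

X_tagged = np.c_[X, fidelity]                    # the fidelity level becomes an input feature
model = GradientBoostingRegressor().fit(X_tagged, y)

# Predict a new material's property "as if" computed at high fidelity.
x_new = np.c_[rng.normal(size=(1, n_feat)), [[1.0]]]
print(model.predict(x_new))
```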
Researchers at UCLA have developed ensembles of Diffractive Deep Neural Networks (D2NNs) for all-optical object classification, achieving higher accuracy than the individual constituent D2NNs and than comparable digital AI models. The success of this ensemble learning approach demonstrates the power of combining multiple predictions into a single, more accurate one.
New research identifies a risk of collusion-like pricing when AI algorithms set prices independently, without observing competitor prices. The study suggests that even such independent pricing algorithms can produce monopoly-like price effects and supracompetitive market outcomes.
The University of Texas at San Antonio's MATRIX AI Consortium has received over $1 million in research funding to develop novel brain-inspired lifelong learning algorithms. Inspired by the honeybee brain, these algorithms aim to close the performance gap between modern AI systems and biological systems.
A joint research team developed DeepTFactor, a deep neural network predicting transcription factors from protein sequences. The tool uses three parallel convolutional neural networks and predicted 332 transcription factors of Escherichia coli K-12 MG1655.
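The "three parallel convolutional neural networks" can be pictured as three convolutional branches with different window sizes scanning the same one-hot-encoded protein sequence, whose pooled features are concatenated before a final transcription-factor-or-not decision. Below is a minimal PyTorch sketch of that pattern; the layer sizes and kernel widths are illustrative, not the published DeepTFactor architecture.

```python
# Minimal sketch of the parallel-CNN idea described above (illustrative sizes,
# not the published DeepTFactor architecture).
import torch
import torch.nn as nn

class ParallelCNNClassifier(nn.Module):
    def __init__(self, n_amino_acids=21, seq_len=1000):
        super().__init__()
        # Three convolutional branches with different receptive fields
        # scan the one-hot-encoded protein sequence in parallel.
        self.branches = nn.ModuleList([
            nn.Sequential(nn.Conv1d(n_amino_acids, 64, kernel_size=k),
                          nn.ReLU(),
                          nn.AdaptiveMaxPool1d(1))
            for k in (4, 8, 16)
        ])
        self.head = nn.Linear(64 * 3, 1)  # binary output: transcription factor or not

    def forward(self, x):  # x: (batch, n_amino_acids, seq_len), one-hot encoded
        feats = torch.cat([b(x).squeeze(-1) for b in self.branches], dim=1)
        return torch.sigmoid(self.head(feats))

model = ParallelCNNClassifier()
probs = model(torch.zeros(2, 21, 1000))  # dummy batch of two sequences
```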
A new study published in the Endocrine Society's Journal of Clinical Endocrinology & Metabolism uses machine learning to predict gestational diabetes in Chinese women. The researchers analyzed nearly 17,000 electronic health records and found that low body mass was associated with an increased risk of gestational diabetes.
A new AI-powered microscope can rapidly image large tissue sections at cellular resolution, potentially during surgery. The DeepDOF microscope uses deep learning to jointly optimize how images are collected and post-processed.
A new machine learning algorithm has classified over 2,300 supernovae with an accuracy rate of 82%, using real data from the Pan-STARRS1 Medium Deep Survey. The classifier was trained on a subset of supernovae with spectra and then applied to the remaining data, achieving high accuracy rates.
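The workflow described is essentially supervised learning on the spectroscopically labeled subset followed by inference on the photometry-only events. A hedged sketch of that train-then-classify step is below; the feature set and the random-forest choice are stand-ins, not the published classifier.

```python
# Sketch of the train-on-labeled, classify-the-rest workflow described above;
# the features and classifier choice are illustrative, not the published pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# Stand-ins for light-curve features (e.g. peak magnitude, rise time, color).
X_labeled = rng.normal(size=(500, 8))       # events with spectroscopic labels
y_labeled = rng.integers(0, 5, size=500)    # supernova subtypes
X_unlabeled = rng.normal(size=(2300, 8))    # photometry-only events

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_labeled, y_labeled)
predicted_types = clf.predict(X_unlabeled)
confidence = clf.predict_proba(X_unlabeled).max(axis=1)
```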
Researchers at the Salk Institute have created a computational model of brain activity that simulates how humans adapt to new situations. The model, which incorporates the concept of 'gating' to control information flow, outperforms previous models and mimics the mistakes seen in patients with prefrontal cortex damage.
The hexxed app, a mobile game, helps researchers compare human and machine problem-solving strategies. Players must figure out the rules on the fly, much as an AI does, allowing scientists to build a benchmark for human intelligence.
Computer simulation is crucial for developing human-interactive smart robots, enabling safer, faster, and more efficient design and control. By analyzing the biology of soft animal structures, researchers can construct virtual proving grounds to understand robot behavior and optimize performance.
Researchers developed a new method combining label-free imaging with artificial intelligence to study live cells over time. The technique allows for the estimation of cell attributes without using toxic fluorescent dyes.
A new machine learning approach offers important insights into catalysis by providing a tool to design efficient catalytic processes. The Bayeschem model explains how catalysts interact with different intermediates and determines the optimal bond strengths.
A new machine learning algorithm, TranSEC, uses traffic data from Uber drivers and publicly available sensor data to map street-level traffic flow over time. This produces a big-picture view of city traffic that supports near-real-time analysis and predictive modeling.
An AI model developed at Duke University successfully identified patients with Alzheimer's disease from retinal images, suggesting its potential as a predictive tool. The study provides proof-of-concept for machine learning analysis of certain types of retinal images to detect the neurological disease in symptomatic individuals.
Insilico Medicine introduces Molecular Sets (MOSES), a benchmarking platform for generative chemistry models, enabling easy comparison and evaluation of new models against existing approaches. The platform provides a curated dataset, metrics, and baselines for assessing model performance.
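In practice, benchmarking a new generator against MOSES amounts to feeding its generated SMILES strings to the platform's metric suite. The call below follows the usage documented in the molsets/moses README (assuming the package is installed); the sample molecules are placeholders, not real model output.

```python
# Sketch of scoring a generative model with MOSES, assuming the `molsets`
# package is installed (pip install molsets) and exposes get_all_metrics
# as documented in its README. The SMILES below are placeholder output.
import moses

generated_smiles = ["CCO", "c1ccccc1", "CC(=O)Oc1ccccc1C(=O)O"]  # toy "model output"
metrics = moses.get_all_metrics(generated_smiles)
print(metrics)  # validity, uniqueness, novelty, FCD, etc., against the curated test set
```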
A new study examines the 'word-of-machine' effect, where consumer preference for AI recommenders is influenced by the importance of utilitarian versus hedonic attributes. When utilitarian features are emphasized, consumers prefer AI over human assistance, while hedonic features lead to a preference for humans.
Researchers exploring the nature of AI failures reveal 'adversarial examples' may not be intentional mistakes. Instead, they might be 'artifacts' created by interactions between network and data patterns. This rethink suggests that misfires could offer useful information if interpreted correctly.
Researchers developed an AI-powered method to detect Parkinson's disease from retinal images, identifying smaller blood vessels as key features. The approach is less costly and more accessible than traditional brain imaging techniques.
KAUST researchers developed a holistic approach using design of experiments and machine learning to identify the greenest method for producing a popular metal organic framework material called ZIF-8. This process reduced waste and energy consumption by optimizing multiple variables simultaneously.
USC researchers have developed a system that lets robots autonomously learn complicated tasks from a very small number of imperfect demonstrations. The system uses signal temporal logic to evaluate the quality of each demonstration, allowing robots to learn more intuitively and adapt to human preferences.
A team of researchers has introduced Deep Potential Molecular Dynamics (DPMD), a new machine learning-based protocol that can simulate over 100 million atoms per day. The protocol achieves ab initio accuracy and was recognized with the ACM Gordon Bell Prize for its achievement in high-performance computing.
Researchers developed a new electronic chip that brings together imaging, processing, machine learning, and memory in one device powered by light. The prototype achieves brain-like functionality and can be used to enable smarter and smaller autonomous technologies like drones and robotics.
Researchers developed machine learning frameworks that provide performance guarantees for robots in unfamiliar settings, certifying, for example, an 88.4% success rate in obstacle-avoidance trials. The approach extends generalization theory to robotics, yielding more broadly applicable guarantees on robot control policies.
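To see the flavor of such a guarantee, note that an empirical success rate over independent trials can be converted into a high-confidence lower bound on the true success probability, for example via Hoeffding's inequality. The snippet below is only an illustration of that conversion, not the generalization bound used in the paper.

```python
# Illustration of turning an empirical success rate into a high-confidence
# lower bound via Hoeffding's inequality. This shows the flavor of a
# "guaranteed success rate", not the specific bound derived in the paper.
import math

def success_lower_bound(successes, trials, delta=0.05):
    """With probability >= 1 - delta, the true success probability >= empirical - eps."""
    empirical = successes / trials
    eps = math.sqrt(math.log(1.0 / delta) / (2.0 * trials))
    return empirical - eps

print(success_lower_bound(successes=900, trials=1000))  # about 0.861 at 95% confidence
```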
Researchers from Bar-Ilan University demonstrate how concepts from physics can solve key challenges in artificial intelligence. By adopting power-law scaling, they show that learning each example once can be equivalent to learning examples repeatedly, enabling rapid decision-making and ultrafast learning.
The VEDLIoT project is developing a new generation of IoT platforms that use machine learning to improve the performance and energy efficiency of devices. The platform aims to enable autonomous vehicles, smart homes, and industrial applications to learn and adapt to their environments.
Researchers created a new type of machine learning model to predict power-conversion efficiency of materials for next-generation organic solar cells. The approach is quick and easy to use, providing important data on chemical fragments that affect performance.
Researchers used machine learning to predict water stability in metal-organic frameworks (MOFs), accelerating the development of new materials. The model, trained on over 200 existing MOFs, enables predictions for other important properties, expanding applications in chemical separations, adsorption, and sensing.
Georgia State University researchers have developed a software tool called COINSTAC, which allows for local data analysis and sharing of results, protecting patient privacy and anonymity. The platform is designed to be compatible with deep learning models, enabling the training of analyses on decentralized databases.
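The decentralized pattern COINSTAC supports can be summarized as: each site computes model updates or statistics on its own data, and only aggregated quantities are shared. A minimal federated-averaging sketch of that pattern is shown below; it is illustrative only and not the COINSTAC codebase.

```python
# Minimal sketch of the decentralized pattern COINSTAC enables: each site fits
# a model update on its own data, and only the aggregated parameters leave the
# site. Illustrative only; not the COINSTAC implementation.
import numpy as np

rng = np.random.default_rng(0)
sites = [(rng.normal(size=(100, 5)), rng.normal(size=100)) for _ in range(3)]
w = np.zeros(5)  # shared linear-model weights

for _ in range(50):                      # federated averaging rounds
    local_ws = []
    for X, y in sites:                   # runs at each site; raw data never shared
        grad = X.T @ (X @ w - y) / len(y)
        local_ws.append(w - 0.1 * grad)  # one local gradient step
    w = np.mean(local_ws, axis=0)        # only parameters are pooled centrally
```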
A group of Skoltech scientists created machine learning algorithms that can predict oil viscosity based on nuclear magnetic resonance (NMR) data. This method has the potential to revolutionize the way oil is processed and understood.
Researchers study insect brain computation to develop faster, more efficient machine learning algorithms. The fruit fly's ability to generalize experiences in complex environments informs the creation of an AI model capable of rapid adaptation.
A new research project at Aarhus University will use Bayesian Neural Networks to model, analyse and understand trading behaviour in the world of finance. This technology is expected to improve our understanding of financial data by providing probability distributions based on complex data.
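The practical payoff of a Bayesian treatment is that predictions come as distributions rather than single numbers. In the hedged sketch below, a small ensemble stands in for a full Bayesian neural network and the trading features are synthetic; it only illustrates how an uncertainty estimate accompanies each prediction.

```python
# Sketch of the key output a Bayesian treatment provides: a predictive
# distribution rather than a point estimate. A small ensemble stands in for a
# full Bayesian neural network; the trading "features" are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))                 # stand-in order-book / trade features
y = X[:, 0] - 0.5 * X[:, 3] + rng.normal(scale=0.2, size=500)

ensemble = [MLPRegressor(hidden_layer_sizes=(32,), max_iter=500,
                         random_state=i).fit(X, y) for i in range(10)]

x_new = rng.normal(size=(1, 10))
preds = np.array([m.predict(x_new)[0] for m in ensemble])
print(f"prediction {preds.mean():.3f} +/- {preds.std():.3f}")  # mean and uncertainty
```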
Scientists developed an AI tool called DeepFRET that analyzes protein motion and interaction, speeding up research and making it accessible to more labs. The tool's accuracy exceeds 95%, outperforming human operators and requiring minimal human input.
A research team at POSTECH developed a machine learning technique that uses transcriptome information from artificial organoids to predict anti-cancer drug response. The method increases predictive accuracy by selecting only relevant biomarkers, overcoming the false signals that limited conventional machine learning approaches.
ESEC/FSE 2020, a leading software engineering conference, presents outstanding research results and trends in a virtual format. The conference features keynote addresses on human-computer collaboration and on diversity and inclusion initiatives.
A machine-learning algorithm detects early stages of Alzheimer's disease using functional magnetic resonance imaging (fMRI). The algorithm uses a convolutional neural network (CNN) to analyze fMRI data and classify patients as healthy, mild cognitive impairment, or Alzheimer's disease.
A new AI-based tool can help clinicians predict which hospitalized patients are at high risk of developing acute kidney injury, allowing for earlier treatment and potentially better outcomes. The Dascena algorithm outperformed the standard method in predicting AKI 72 hours prior to onset.
Researchers from FAU's College of Engineering and Computer Science are developing new theory and methods to curate training data sets for AI learning and screen real-time operational data for safe field deployment. The project aims to identify faulty, unusual and irregular information for AI operations.
Purdue University is leading a $3.7 million research project to create more secure machine learning algorithms for autonomous systems. The goal is to develop a robust, distributed and usable software suite to prevent AI hacking and ensure the accuracy of autonomous machines on the battlefield.
A new AI-based method has been developed to detect tiny, imperceptible earthquakes that occur on the same faults as larger ones. The technology could provide insight into how earthquakes interact and spread along a fault, giving a clearer view of earthquake patterns.
Researchers at the University of Chicago have developed a new method to design polymers with specific properties, such as degradable plastic bags or super-strong aircraft materials. By combining modeling and machine learning, they created a large database of hypothetical polymers and trained a neural network to predict their properties.
Researchers developed new AI models inspired by nature, reducing complexity and enhancing interpretability. These models can control vehicles with just a few artificial neurons, outperforming previous deep learning models in tasks such as autonomous lane keeping.
Researchers have developed machine learning algorithms to predict which RNA-based toehold switches function well, enabling the identification and optimization of these tools. The algorithms analyzed a massive dataset of over 100,000 toehold switch sequences and predicted their behavior with high accuracy.
Machine learning is transforming traditional science assessment by making it possible to measure complex constructs, expanding assessment functionality, and automating scoring. The technology is expected to redefine science assessment practices and change the future of education.
The inaugural ACM International Conference on AI in Finance (ICAIF) will explore how artificial intelligence is transforming the finance industry. The conference features a dynamic program of work from top researchers, including papers on machine learning methods to detect money laundering and algorithms in future capital markets.
Researchers created machine learning models to predict patients' post-surgical pain and opioid needs. The models accurately identified high-risk patients 80% of the time, suggesting more effective pain management strategies with non-opioid alternatives.
Researchers at Cold Spring Harbor Laboratory have developed an AI tool that can efficiently recognize neurons in microscope images, significantly improving the accuracy of automated tracing and analysis. The advance is aimed at untangling brain connectivity and, ultimately, at understanding how brains work.
Researchers are developing a novel deep learning technique to identify relationships between brain networks and Alzheimer's disease using algorithms mimicking neural networks. The goal is to pinpoint specific areas in the brain to slow and treat disease progression.
Berkeley Lab scientists develop a tool using machine learning algorithms to guide synthetic biology development systematically. The Automated Recommendation Tool (ART) predicts how changes in a cell's DNA or biochemistry will affect its behavior and recommends the next engineering cycle.
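ART's recommend-next-cycle step can be thought of as: fit a probabilistic surrogate to the design/response data collected so far, then rank untested designs by an exploration-aware score. The sketch below shows that loop with a Gaussian process; the design variables and scoring rule are assumptions, not the ART implementation.

```python
# Sketch of the recommend-next-designs loop ART automates: fit a probabilistic
# model to past design/response data, then rank untested designs by an
# exploration-aware score. Illustrative only, not the ART implementation.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)
X_tested = rng.uniform(size=(30, 4))        # e.g. promoter strengths for 4 genes
y_tested = -((X_tested - 0.6) ** 2).sum(1)  # stand-in for measured production titer

gp = GaussianProcessRegressor().fit(X_tested, y_tested)

X_candidates = rng.uniform(size=(1000, 4))               # untested designs
mean, std = gp.predict(X_candidates, return_std=True)
next_round = X_candidates[np.argsort(mean + std)[-8:]]   # recommend 8 designs to build
```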
Swiss Center for Electronics and Microtechnology engineers developed an approach to overcome the initial trial-and-error phase of reinforcement learning. This allows computers to quickly find the right path without extreme fluctuations, slashing energy use by over 20% in complex systems.
The DOE has awarded $2.16 million to two physicists at Jefferson Lab for AI-assisted experiment control and calibration, as well as improved superconducting radio-frequency (SRF) operation at the CEBAF accelerator facility. These projects aim to optimize operations and generate better-quality data, potentially saving months of research labor.
A team led by Lydia Kavraki used machine learning to predict the quality of printed scaffold materials, finding that controlling print speed is critical to making high-quality implants. The collaboration could lead to better ways to quickly print customized implants.
Researchers at UMass Amherst and Baylor College of Medicine developed a new method to protect deep neural networks from catastrophic forgetting, inspired by the brain's 'replay' ability. The method, called generative replay, generates high-level representations of previously seen data, preventing the network from forgetting earlier learning.
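The core trick of generative replay is to interleave generator-produced "memories" of earlier tasks into each new task's training batches. The toy sketch below shows that interleaving with stand-in models and data; it is not the authors' implementation.

```python
# Minimal sketch of generative replay: while learning task B, interleave samples
# drawn from a generator trained on task A so the classifier keeps seeing
# "memories" of the old task. Models and data here are toy stand-ins.
import torch
import torch.nn as nn

classifier = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 4))
opt = torch.optim.Adam(classifier.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def sample_old_task(batch):
    # Stand-in for the trained generator's output plus the old model's labels.
    return torch.randn(batch, 20), torch.randint(0, 2, (batch,))

for step in range(100):
    x_new = torch.randn(32, 20)                 # current-task batch (classes 2-3)
    y_new = torch.randint(2, 4, (32,))
    x_old, y_old = sample_old_task(32)          # replayed pseudo-data (classes 0-1)
    x = torch.cat([x_new, x_old])
    y = torch.cat([y_new, y_old])
    opt.zero_grad()
    loss_fn(classifier(x), y).backward()
    opt.step()
```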
Researchers used AI to analyze stool samples from over 1,000 individuals, identifying clusters of bacteria that could indicate the presence or absence of cardiovascular disease. The study suggests fecal microbiota composition could serve as a convenient diagnostic screening method for CVD.
Researchers at University of California San Diego use artificial intelligence to identify a DNA activation code called the downstream core promoter region (DPR) that's used as frequently as the TATA box in humans. The discovery could be used to control gene activation in biotechnology and biomedical applications.
A team of researchers from the University of California, San Francisco, has made a significant breakthrough in developing a 'plug and play' brain prosthesis that enables individuals with paralysis to control devices using their brain activity. The device uses machine learning algorithms to match brain signals to desired movements.
Scientists have developed an artificial intelligence system that autonomously learns how to grip and move individual molecules, overcoming the complexity of nanoscale manipulation. The system uses reinforcement learning to find optimal movement patterns, enabling targeted assembly and separation of molecules.
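At its core, the system's learning loop is reinforcement learning: try a manipulation action, observe whether the molecule moved as intended, and update the policy from that reward. The toy tabular Q-learning example below illustrates that loop on a one-dimensional "move to the target site" task; it is not the scanning-probe control system itself.

```python
# Toy illustration of the reinforcement-learning loop: an agent learns which
# manipulation actions move a "molecule" toward a target position on a 1-D
# track. This is a tabular stand-in, not the actual nanoscale control system.
import numpy as np

rng = np.random.default_rng(0)
n_positions, n_actions = 10, 2          # actions: 0 = push left, 1 = push right
Q = np.zeros((n_positions, n_actions))
target = 9

for episode in range(500):
    state = 0
    for _ in range(50):
        # Epsilon-greedy choice between exploring and the best known action.
        action = rng.integers(n_actions) if rng.random() < 0.1 else Q[state].argmax()
        next_state = np.clip(state + (1 if action == 1 else -1), 0, n_positions - 1)
        reward = 1.0 if next_state == target else -0.01
        Q[state, action] += 0.1 * (reward + 0.95 * Q[next_state].max() - Q[state, action])
        state = next_state
        if state == target:
            break
```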
A team of engineers and computer scientists are developing a theory of deep learning based on rigorous mathematical principles to improve reliability and predictability in AI systems. They will use three perspectives: local to global understanding, statistical analysis, and formal verification.