Machine learning helps grow artificial organs

July 07, 2020

Researchers from the Moscow Institute of Physics and Technology, Ivannikov Institute for System Programming, and the Harvard Medical School-affiliated Schepens Eye Research Institute have developed a neural network capable of recognizing retinal tissues during the process of their differentiation in a dish. Unlike human experts, who rely on fluorescent labels, the algorithm achieves this without the cells having to be modified, making the method suitable for growing retinal tissue for developing cell replacement therapies to treat blindness and for conducting research into new drugs. The study was published in Frontiers in Cellular Neuroscience.

This could expand the technology's applications to multiple fields, including drug discovery and the development of cell replacement therapies to treat blindness.

In multicellular organisms, the cells making up different organs and tissues are not the same. They have distinct functions and properties, acquired in the course of development. They start out the same, as so-called stem cells, which have the potential to become any kind of cell found in the mature organism. They then undergo differentiation by producing proteins specific to certain tissues and organs.

The most advanced technique for replicating tissue differentiation in vitro relies on 3D cell aggregates called organoids. The method has already proved effective for studying the development of the retina, the brain, the inner ear, the intestine, the pancreas, and many other tissue types. Since organoid-based differentiation closely mimics natural processes, the resulting tissue is very similar to the one in an actual biological organ.

Some stages of cell differentiation into retinal tissue are stochastic (random) in nature, leading to considerable variation in the number of cells with a particular function, even between artificial organs in the same batch. The discrepancy is greater still when different cell lines are involved. As a result, a means of determining which cells have already differentiated at a given point in time is necessary. Otherwise, experiments will not be truly replicable, making clinical applications less reliable, too.

To spot differentiated cells, tissue engineers use fluorescent proteins. By inserting the gene responsible for the production of such a protein into the DNA of cells, researchers ensure that it is synthesized and produces a signal once a certain stage in cell development has been reached. While this technique is highly sensitive, specific, and convenient for quantitative assessments, it is not suitable for cells intended for transplantation or hereditary disease modeling.

To address this limitation, the authors of the recent study in Frontiers in Cellular Neuroscience have proposed an alternative approach based on tissue structure. Until now, no reliable, objective criteria for predicting the quality of differentiated cells had been formulated. The researchers proposed selecting the best retinal tissues -- those most suitable for transplantation, drug screening, or disease modeling -- using neural networks and artificial intelligence.

"One of the main focuses of our lab is applying the methods of bioinformatics, machine learning, and AI to practical tasks in genetics and molecular biology. And this solution, too, is at the interface between sciences. In it, neural networks, which are among the things MIPT traditionally excels at, address a problem important for biomedicine: predicting stem cell differentiation into retina," said study co-author Pavel Volchkov, who heads the Genome Engineering Lab at MIPT.

"The human retina has a very limited capacity for regeneration," the geneticist went on. "This means that any progressive loss of neurons -- for example, in glaucoma -- inevitably leads to complete loss of vision. And there is nothing a physician can recommend, short of getting a head start on learning Braille. Our research takes biomedicine a step closer to creating a cellular therapy for retinal diseases that would not only halt the progression but reverse vision loss."

The team trained a neural network -- that is, a computer algorithm that mimics the way neurons work in the human brain -- to identify the tissues in a developing retina from photographs taken with a conventional light microscope. The researchers first had a number of experts identify the differentiated cells in 1,200 images using an accurate technique that involves a fluorescent reporter. The neural network was then trained on 750 of the images, with another 150 used for validation and 250 for testing its predictions. At this last stage, the machine spotted differentiated cells with 84% accuracy, compared with the 67% achieved by humans.
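The evaluation protocol described above -- a fixed train/validation/test split over expert-labeled images, followed by an accuracy comparison -- can be sketched in plain Python. Only the 750/150/250 split and the accuracy metric come from the article; the class names and the toy majority-class "model" below are illustrative assumptions standing in for the actual convolutional network and fluorescent-reporter ground truth.

```python
import random

# Split sizes reported in the article.
TRAIN, VAL, TEST = 750, 150, 250

random.seed(0)

# Hypothetical dataset: (image_id, label) pairs. The label stands in
# for the fluorescent-reporter ground truth produced by the experts.
dataset = [(i, random.choice(["differentiated", "undifferentiated"]))
           for i in range(TRAIN + VAL + TEST)]
random.shuffle(dataset)

train = dataset[:TRAIN]
val = dataset[TRAIN:TRAIN + VAL]
test = dataset[TRAIN + VAL:]

def accuracy(predictions, labeled_examples):
    """Fraction of examples where the prediction matches the label."""
    hits = sum(pred == label
               for pred, (_, label) in zip(predictions, labeled_examples))
    return hits / len(labeled_examples)

# Placeholder "model": always guess the training set's majority class.
# The real system would be a network trained on the images themselves.
majority = max({lbl for _, lbl in train},
               key=lambda c: sum(lbl == c for _, lbl in train))
preds = [majority] * len(test)

print(len(train), len(val), len(test))  # 750 150 250
print(round(accuracy(preds, test), 2))
```

Holding out the 250-image test set until the very end, as the study does, is what makes the 84%-vs-67% comparison against human annotators meaningful: neither the model nor its hyperparameters ever saw those images.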

"Our findings indicate that the current criteria used for early-stage retinal tissue selection may be subjective. They depend on the expert making the decision. However, we hypothesized that the tissue morphology, its structure, contains clues that enable predicting retinal differentiation, even at very early stages. And unlike a human, the computer program can extract that information!" commented Evgenii Kegeles of the MIPT Laboratory for Orphan Disease Therapy and Schepens Eye Research Institute, U.S.

"This approach does not require images of very high quality, fluorescent reporters, or dyes, making it relatively easy to implement," the scientist added. "It takes us one step closer to developing cellular therapies for retinal diseases such as glaucoma and macular degeneration, which today invariably lead to blindness. Besides that, the approach can be transferred not just to other cell lines, but also to other human artificial organs."

The Moscow Institute of Physics and Technology is a leading Russian technical university featured in the top international university rankings. It offers degrees in fundamental and applied physics, mathematics, informatics and computer science, chemistry, biology, and other natural and engineering sciences. MIPT is an advanced scientific center that conducts research into aging and aging-related diseases, applied and fundamental physics, 2D materials, quantum technology, artificial intelligence, genome engineering, and Arctic and space exploration.

Moscow Institute of Physics and Technology
