Organoids—miniature, lab-grown tissues that mimic the structure and function of organs—are advancing biomedical research. They hold great promise for personalized transplants, disease modeling of conditions such as Alzheimer’s disease and cancer, and precise insights into the effects of drugs.
Researchers from Kyushu University and Nagoya University in Japan have now developed an artificial intelligence (AI) model that can predict how organoids will develop at an early stage of their growth. The approach is faster and more accurate than assessment by trained researchers, potentially lowering the costs and improving the efficiency of culturing organoids.
The study, published in Communications Biology, focused on hypothalamic-pituitary organoids. These organoids replicate the function of the pituitary gland, which produces adrenocorticotropic hormone (ACTH), a vital regulator of stress, metabolism, blood pressure, and inflammation. ACTH deficiency can cause severe symptoms such as fatigue and anorexia, and can be life-threatening.
"In our lab, our studies on mice show that transplanting hypothalamic-pituitary organoids has the potential to treat ACTH deficiency in humans,” says Hidetaka Suga, Associate Professor at Nagoya University’s Graduate School of Medicine.
A major challenge in organoid research is ensuring proper development. Organoids, derived from stem cells suspended in liquid, are highly sensitive to environmental conditions, which can result in variability in their growth and quality.
The researchers identified broad expression of a protein called RAX early in development as an indicator of healthy progress, one that often leads to strong ACTH secretion in mature organoids.
“We can track development by genetically modifying the organoids to make the RAX protein fluoresce,” explains Suga. “However, organoids intended for clinical use, like transplantation, can’t be genetically modified to fluoresce. So our researchers must judge instead based on what they see with their eyes: a time-consuming and inaccurate process.”
To address this issue, Suga and his team collaborated with Hirohiko Niioka, Professor at Kyushu University’s Data-Driven Innovation Initiative, to train deep-learning models for more accurate predictions.
“Deep-learning models are a type of AI that mimics the way the human brain processes information, allowing them to analyze and categorize large amounts of data by recognizing patterns,” says Niioka.
The team captured fluorescent images (showing RAX protein expression) and bright-field images (standard white-light images) of organoids at 30 days of development. Using the fluorescent images as a reference, they classified 1,500 bright-field images into three quality categories: A (wide RAX expression, high quality), B (medium RAX expression, medium quality), and C (narrow RAX expression, low quality).
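To give a concrete picture of this labeling step, the sketch below shows one hypothetical way a paired fluorescence image could be converted into an A/B/C label from how widely the RAX signal covers the organoid. The helper function rax_quality_label and its thresholds are illustrative assumptions, not the criteria used in the study.

```python
# Illustrative sketch (not the authors' protocol) of labeling an organoid A/B/C
# from the extent of RAX fluorescence. Thresholds are hypothetical.
import numpy as np

def rax_quality_label(fluorescent: np.ndarray,
                      signal_threshold: float = 0.2,
                      wide: float = 0.5,
                      medium: float = 0.2) -> str:
    """Return 'A', 'B', or 'C' from the fraction of pixels showing RAX signal."""
    coverage = (fluorescent > signal_threshold).mean()  # fraction of bright pixels
    if coverage >= wide:
        return "A"   # wide RAX expression  -> high quality
    if coverage >= medium:
        return "B"   # medium RAX expression -> medium quality
    return "C"       # narrow RAX expression -> low quality

# Example with a random normalized fluorescence image in place of real data:
print(rax_quality_label(np.random.rand(512, 512)))
```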
Niioka trained two advanced deep-learning models for this task—EfficientNetV2-S and Vision Transformer, both developed by Google—using 1,200 of the bright-field images as the training set. He then combined the two models into an ensemble for greater accuracy. When tested on the remaining 300 images, the ensemble classified organoids with 70% accuracy, outperforming human experts, whose accuracy was less than 60%.
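As a rough illustration of how such an ensemble works (this is not the authors' code), the Python sketch below uses PyTorch and the timm library to combine an EfficientNetV2-S and a Vision Transformer by averaging their softmax outputs. The exact model variants, preprocessing, and training setup used in the study are not reproduced here and are assumptions.

```python
# Minimal sketch of a two-model ensemble for 3-class organoid quality
# classification. Model names and image size are illustrative assumptions.
import torch
import timm

NUM_CLASSES = 3  # 0 = A (high), 1 = B (medium), 2 = C (low) quality

# Two backbones with 3-way classification heads. In practice both would first be
# fine-tuned on the 1,200 labeled bright-field images.
effnet = timm.create_model("tf_efficientnetv2_s", num_classes=NUM_CLASSES)
vit = timm.create_model("vit_base_patch16_224", num_classes=NUM_CLASSES)

def ensemble_predict(models, images):
    """Average each model's softmax probabilities (soft voting), then take argmax."""
    probs = []
    for model in models:
        model.eval()
        with torch.no_grad():
            probs.append(torch.softmax(model(images), dim=1))
    return torch.stack(probs).mean(dim=0).argmax(dim=1)  # predicted class per image

# Example: classify a batch of (placeholder) preprocessed 224x224 bright-field images.
batch = torch.randn(8, 3, 224, 224)
print(ensemble_predict([effnet, vit], batch))
```

Averaging class probabilities is one common way to combine models; the ensembling scheme actually used in the study may differ.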
“The deep-learning models outperformed the experts in all respects: in their accuracy, their sensitivity, and in their speed,” notes Niioka.
The researchers then tested the model on organoids without genetic modification for fluorescent RAX proteins. Staining techniques revealed that organoids classified as A (high quality) by the model exhibited high RAX expression at 30 days and later strong ACTH secretion. Conversely, those classified as C (low quality) showed lower RAX and ACTH levels.
“Our model can therefore predict at an early stage of development what the final quality of the organoid will be, based solely on its visual appearance,” says Niioka. “As far as we know, this is the first time in the world that deep learning has been used to predict the future of organoid development.”
Looking ahead, the team aims to enhance the model’s accuracy by training it on a larger dataset. Even at its current accuracy level, the model offers transformative benefits to organoid research.
“We can quickly and easily select high-quality organoids for transplantation and disease modeling, and reduce time and costs by identifying and removing organoids that are developing less well,” concludes Suga. “It’s a game-changer.”