Poster Presentation – Hunter Cell Biology Meeting 2025

MitoMimics – Synthetic Microscopy Data for Deep-Learning Segmentation and Tracking of Mitochondrial Dynamics (#134)

Mr. Aidan Quinn 1, Mr. Volkan Ozcoban 1, Mr. Sanjeev Uthishtran 2, Dr. Senthil Arumugam 2, Prof. Vijay A. Rajagopal 1
  1. Biomedical Engineering, University of Melbourne, Melbourne, Vic, Australia
  2. Anatomy and Developmental Biology, Monash University, Melbourne, Vic, Australia

Aims: 

The precise quantification of mitochondrial dynamics, morphology, and behavior is crucial to understanding their roles in energy production, signaling, metabolism, and disease. Contemporary microscopy techniques can generate hour-long time-lapse movies at temporal and spatial resolutions high enough to reveal even the finest details of mitochondrial structure and behavior. Current approaches to quantification and tracking of these data rely on an initial algorithmic segmentation step to separate mitochondria from the background. Algorithmic segmentation is prone to error, especially on noisy, photo-gentle datasets, and the downstream tracking and analysis suffer as a result. While deep-learning methods are widely adopted in imaging fields such as electron microscopy and medical imaging, they remain largely underexplored in this domain, likely because of the prohibitive time, cost, and expertise required to create training datasets. Our aim is to provide an alternative to manual data labelling that enables the use of deep learning without these associated costs.

Methods: 

We present MitoMimics, a synthetic data generation platform that creates realistic time-lapse microscopy movies, complete with signal and noise simulations and dynamic mitochondrial behaviors such as fusion, fission, branching, and migration. The fully annotated synthetic datasets enable the training of temporally aware 3D deep-learning segmentation models without any manual annotation of real data. We demonstrate the efficacy of this approach with a novel post-processing pipeline that combines the outputs of two standard 3D segmentation models to achieve both the separation of individual mitochondrial instances and the tracking of those instances through fission and fusion events across each movie. The trained models are tested on real photo-gentle widefield microscopy data to evaluate their performance under challenging imaging conditions.
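The abstract does not specify how the two model outputs are fused; the sketch below shows one illustrative possibility, not the MitoMimics implementation. It assumes one network predicts a voxel-wise foreground probability and the other predicts boundaries between touching mitochondria, combined with a marker-based watershed. The function name, thresholds, and use of scipy/scikit-image are assumptions for illustration only.

# Minimal sketch (assumed scheme, not the MitoMimics pipeline): fuse a foreground
# probability map with a boundary prediction to separate touching mitochondria
# via marker-based watershed. Thresholds and library choices are illustrative.
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed

def separate_instances(foreground_prob, boundary_prob,
                       fg_thresh=0.5, seed_thresh=0.7):
    """Fuse two voxel-wise predictions into labelled 3D instances.

    foreground_prob : (Z, Y, X) probability that a voxel is mitochondrion.
    boundary_prob   : (Z, Y, X) probability that a voxel lies on a boundary
                      between touching mitochondria (hypothetical second model).
    """
    foreground = foreground_prob > fg_thresh
    # Seeds: confident interior voxels, i.e. strong foreground away from boundaries.
    seeds = (foreground_prob > seed_thresh) & (boundary_prob < 0.5)
    markers, _ = ndi.label(seeds)
    # Grow seeds outwards through the foreground mask; the boundary probability
    # acts as the elevation map, so watershed fronts meet at predicted splits.
    labels = watershed(boundary_prob, markers, mask=foreground)
    return labels

# Toy usage with random volumes standing in for network outputs.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fg = rng.random((16, 64, 64))
    bd = rng.random((16, 64, 64))
    labels = separate_instances(fg, bd)
    print("instances found:", labels.max())

Tracking through fission and fusion could then be framed as linking instance labels between consecutive time points, but the abstract does not detail that step, so it is omitted here.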

Results: 

Our strategy significantly outperformed both algorithmic and simpler machine-learning methods, achieving higher segmentation fidelity and a more than ten-fold improvement in instance-separation metrics. The dual-model approach effectively distinguished genuine fusion and fission events and demonstrated robust noise handling on photo-gentle widefield datasets.

Conclusion: 

MitoMimics offers a powerful solution to the cost, time, and expertise barriers that manual annotation poses for deep learning-enabled analysis of mitochondrial dynamics. By generating realistic synthetic datasets and leveraging temporally aware 3D segmentation models, our framework enables accurate, long-term mitochondrial tracking under biologically relevant, photo-gentle conditions. This paves the way for more in-depth and precise analysis of mitochondrial dynamics and highlights the broader potential of deep-learning models trained on synthetic data for subcellular dynamics research.