Computational Image Analysis Research
Modern imaging methods enable us to capture biological processes in unprecedented detail and with high temporal and spatial resolution. Often, however, we face the problem of extracting and quantifying relevant information from the vast amount of image data generated before hypotheses can be tested and new theories and models can be developed. To this end, cutting-edge methods from pattern recognition, machine learning, and high-performance computing are required to efficiently handle and automatically analyze very large image data sets.
We are a new group working on computational tools to analyze, quantify, and understand biological image data (“Bioimage Bioinformatics”). We are interested in images on all scales, from high-resolution microscopy, to 3D time-lapse imaging of tissues, to movies of swarming animals. We use techniques such as automated segmentation, feature tracking and pattern recognition to reproducibly quantify large multidimensional image datasets.
In our research, we use image analysis, for example:
- to clarify the role of forces in multicellular organization and tissue development
- to quantify and compare network-like structures in biological systems
- to measure spatiotemporal dynamics in order to understand signaling and control mechanisms
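As an illustration of the kind of automated quantification described above, the sketch below shows a minimal segmentation-and-measurement step: threshold an intensity image, label connected objects, and measure their sizes. It uses only NumPy and SciPy; the synthetic image and the function name `segment_and_count` are illustrative assumptions, not the group's actual pipeline.

```python
import numpy as np
from scipy import ndimage

def segment_and_count(image, threshold):
    """Threshold an intensity image and count connected foreground objects.

    This is a deliberately simple stand-in for a real segmentation
    pipeline (no denoising, no watershed splitting of touching objects).
    """
    mask = image > threshold                 # foreground/background split
    labels, n_objects = ndimage.label(mask)  # connected-component labelling
    # Pixel count per labelled object (object areas in pixels).
    sizes = ndimage.sum(mask, labels, range(1, n_objects + 1))
    return n_objects, sizes

# Synthetic test image: two bright square "cells" on a dark background.
img = np.zeros((64, 64))
img[10:20, 10:20] = 1.0
img[40:55, 40:55] = 1.0

n, sizes = segment_and_count(img, threshold=0.5)
print(n)                       # 2
print(sorted(sizes.tolist()))  # [100.0, 225.0]
```

In practice each step would be replaced by a method suited to the data (e.g. adaptive thresholding for uneven illumination), but the structure — segment, label, measure — is the common backbone of reproducible image quantification.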
2018
Tensile forces drive a reversible fibroblast-to-myofibroblast transition during tissue growth in engineered clefts. Science Advances 4(1), eaao4881 (2018).
2017
The Small World of Osteocytes: Connectomics of the Lacuno-Canalicular Network in Bone. New Journal of Physics 19, 073019 (2017).
Spatial heterogeneity in the canalicular density of the osteocyte network in human osteons. Bone Reports 6, 101-108 (2017).
FIJI Macro 3D ART VeSElecT: 3D Automated Reconstruction Tool for Vesicle Structures of Electron Tomograms. PLOS Computational Biology 13(1), 1-21 (2017).
Coalignment of osteocyte canaliculi and collagen fibers in human osteonal bone. Journal of Structural Biology 199, 177-186 (2017).