Microscopy has been used as a qualitative measurement tool for hundreds of years. However, qualitative measurements are time-consuming and labour-intensive when thousands of three-dimensional (3D) image stacks must be examined to learn cell population statistics. This motivates our research on transitioning from qualitative microscopy, which yields low confidence in cell population statistics, to quantitative microscopy, which enables high confidence in cell population measurements. One key step towards quantitative microscopy is automated 3D segmentation that separates foreground from background. The challenge in automated 3D segmentation is to evaluate the accuracy, precision and computational efficiency of an automated solution.
This paper addresses the problem of designing and evaluating automated 3D segmentation methods on thousands of 3D stem cell image stacks. The design and evaluation methodology consists of (1) constructing candidate segmentation algorithms according to the imaging and geometrical assumptions of an experiment, (2) evaluating segmentation accuracy on a sample of 3D image stacks selected based on statistics of the candidate segmentations and (3) developing Web-based visual verification tools to inspect segmentations of thousands of 3D stacks.
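Step (1) builds candidate algorithms from imaging assumptions such as bimodal intensity distributions. As an illustration only (the paper's specific candidate algorithms are not reproduced here, and `segment_stack` is a hypothetical helper), one simple foreground/background separation of a z-stack might apply a global Otsu threshold:

```python
import numpy as np

def otsu_threshold(stack: np.ndarray) -> float:
    """Otsu's method: pick the intensity threshold that maximizes
    the between-class variance of background vs. foreground voxels."""
    hist, edges = np.histogram(stack, bins=256)
    centers = (edges[:-1] + edges[1:]) / 2.0
    total = hist.sum()
    sum_all = (hist * centers).sum()
    w_b = 0.0          # cumulative background weight
    sum_b = 0.0        # cumulative background intensity sum
    best_t, best_var = centers[0], -1.0
    for i in range(256):
        w_b += hist[i]
        if w_b == 0 or w_b == total:
            continue
        sum_b += hist[i] * centers[i]
        m_b = sum_b / w_b                        # background mean
        m_f = (sum_all - sum_b) / (total - w_b)  # foreground mean
        var_between = w_b * (total - w_b) * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, centers[i]
    return best_t

def segment_stack(stack: np.ndarray) -> np.ndarray:
    """Binary foreground mask for one 3D z-stack via global thresholding."""
    return stack > otsu_threshold(stack)

# Synthetic z-stack: dim background with one bright block of "cell" voxels
rng = np.random.default_rng(0)
stack = rng.normal(20.0, 3.0, size=(8, 32, 32))
stack[:, 8:16, 8:16] += 100.0
mask = segment_stack(stack)
```

A real candidate in the methodology would typically add experiment-specific steps (e.g. noise filtering or morphological cleanup) on top of such a baseline; this sketch only shows the thresholding core.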
The application context for the 3D segmentation problem comes from investigating the effects of various biomaterial scaffolds on the 3D shape of stem cells. It was hypothesized that a scaffold type affects cell morphology and influences cell behaviour. To obtain statistically significant evidence for testing this hypothesis, primary human bone marrow stromal cells (hBMSCs) were cultured on 10 scaffold types. The cells were stained for actin and the nucleus and imaged using confocal laser scanning microscopy (CLSM), yielding 128 460 image frames (on average 125 cells/scaffold × 10 scaffold types × 2 stains × 51 frames/cell).
We demonstrated the segmentation design and evaluation methodology by applying it to a dataset of 1253 mesenchymal stem cells. The most accurate 3D segmentation algorithm achieved an average precision of 0.82 and an accuracy of 0.84 as measured by the Dice similarity index over manually segmented cells. An index value larger than 0.7 indicates good spatial overlap between the reference and automated segmentations. Based on visual inspection of all cell segmentations, the probability of segmentation success was 85%. The computation time to process 1253 image z-stacks was 42.3 h. The Web-based visualization of all z-stack segmentations, including actin and nucleus channels, is available at https://isg.nist.gov/deepzoomweb/data (under the stem cell-scaffold interaction project). The raw and segmented data are available from https://isg.nist.gov/deepzoomweb/zstackDownload.
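The Dice similarity index used above is defined as 2|A ∩ B| / (|A| + |B|) for a reference mask A and an automated mask B, ranging from 0 (no overlap) to 1 (identical). A minimal sketch of computing it on 3D binary masks (the toy masks below are illustrative, not data from the study):

```python
import numpy as np

def dice_index(reference: np.ndarray, segmentation: np.ndarray) -> float:
    """Dice similarity index 2|A ∩ B| / (|A| + |B|) between binary 3D masks."""
    a = reference.astype(bool)
    b = segmentation.astype(bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both masks empty: define as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / denom

# Toy z-stack: two 2x4x4 blocks offset by one voxel in y and x
ref = np.zeros((4, 8, 8), dtype=bool)
seg = np.zeros((4, 8, 8), dtype=bool)
ref[1:3, 2:6, 2:6] = True   # "manual" reference mask (32 voxels)
seg[1:3, 3:7, 3:7] = True   # "automated" mask, shifted (32 voxels, 18 shared)
print(dice_index(ref, seg))  # 2*18 / (32+32) = 0.5625
```

A score of 0.5625 would fall below the 0.7 threshold cited above, flagging this toy segmentation as a poor spatial match.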