Segmenting and tracking cellular populations in phase contrast microscopy

A new system tracks live cells, even when they are in contact, in time-lapse microscopy images and detects mitotic events based on the amount of overlap between cellular regions.
27 December 2011
Joe Chalfoun, Michael Halter, Peter Bajcsy and Mary Brady

New quantitative tools for monitoring cellular behavior have provided insight into human diseases (such as cancer), development, and wound healing. Live imaging of cellular populations—where cells growing in a dish are imaged by microscopy—uniquely provides access to changes in gene expression and phenotype. Changes within the cells, such as shape, proliferation rate, or abundance of certain fluorescently labeled proteins, can be monitored by following individual cells within the field captured by the microscope. The technology to record live images of cellular populations has been available for some time, but only recently have fully automated approaches for deriving quantitative data from these image sets become practical. The unique automation challenges in cell tracking and image segmentation include the lack of invariant shape and intensity descriptors for cells as each cell progresses through the cell cycle (for example, its dividing and non-dividing phases). Additionally, there is ambiguity in defining the cell edge when cells are in contact or approaching confluence.

Many existing cell tracking techniques are based on complex probabilistic models. Examples include Gaussian probability density functions to characterize tracking criteria,[1] random walk models applied to individual cell tracks,[2] an interacting multiple model filter and level set methods applied to the entire collection of cell tracks,[3] and global spatio-temporal data association methods applied to a tree representation of all cell trajectories.[4] While these cell tracking techniques produce accurate results for isolated cells, they have many adjustable parameters and might not be suitable for nearly confluent cell cultures. Our approach[5] extends the spatio-temporal data association concept of Bise and colleagues[4] and incorporates biological knowledge about changes in the 3D geometry and refractive indices of cells on a culture substrate.

We divided the tracking solution into a cell segmentation problem and a cell correspondence problem (the latter being the problem of linking a cell in one image of a time-lapse sequence with the same cell in the next image). Figure 1 shows an overview of our segmentation method, which incorporates biological knowledge about dividing and non-dividing cells to achieve reliable separation of foreground and background pixels and to separate cells in contact. Dividing cells release cellular attachments, depolymerize their cytoskeletal components, and become round during cell division. We map these features into our algorithm as tests for object roundness and high-intensity pixels that identify dividing cells. Non-dividing cells, by contrast, are attached and spread out, with dark cytoplasm and slightly brighter nuclei. These features are mapped into the segmentation model that describes the expected characteristics of pixels residing within a cell. We addressed the cell correspondence problem using a temporal sampling rate that guarantees spatial overlap of the same cell in two consecutive images, together with cell neighborhood analysis. To validate our modeling approach to cell tracking, we used Zernike phase contrast images of fibroblast cells (NIH 3T3).[5]


Figure 1. Segmentation and cell tracking overview.
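As a rough illustration of the two ideas described above (roundness- and brightness-based detection of dividing cells, and overlap-based correspondence between consecutive frames), the following Python sketch uses scikit-image region properties. The function names, thresholds, and choice of library are illustrative assumptions, not the authors' implementation.

import numpy as np
from skimage.measure import regionprops

def find_mitotic_candidates(intensity_image, labeled_mask,
                            min_circularity=0.85, min_mean_intensity=0.7):
    """Flag labeled regions that are both round and bright, since dividing
    cells in phase contrast appear as compact, high-intensity objects.
    Thresholds here are illustrative, not the authors' values."""
    candidates = []
    for region in regionprops(labeled_mask, intensity_image=intensity_image):
        # circularity = 4*pi*area / perimeter^2; 1.0 for a perfect disk
        circularity = 4.0 * np.pi * region.area / (region.perimeter ** 2 + 1e-9)
        if circularity >= min_circularity and region.mean_intensity >= min_mean_intensity:
            candidates.append(region.label)
    return candidates

def link_by_overlap(labels_prev, labels_curr):
    """Associate each cell in the current frame with the previous-frame cell
    it overlaps most; the temporal sampling is assumed dense enough that a
    cell overlaps itself in consecutive images."""
    links = {}
    for region in regionprops(labels_curr):
        overlapping = labels_prev[labels_curr == region.label]
        overlapping = overlapping[overlapping > 0]
        if overlapping.size > 0:
            links[region.label] = int(np.bincount(overlapping).argmax())
        else:
            links[region.label] = None  # no overlap: a new or newly born cell
    return links

In a complete tracker, the overlap links and mitosis candidates would feed into the neighborhood analysis and data association steps described above.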

We compared the automated segmentation results against manual segmentation provided by experts for a stack of 238 images. The automated segmentation agrees with the manual segmentation with an average adjusted Rand index[6] of 0.96 (where 1 is a perfect match) and a standard deviation of 0.03. The cell tracker achieved over 95% accuracy based on a complete-cell-cycle metric: the number of cells correctly tracked from birth to death, as determined by comparing manual and automated tracking. In addition, we compared more biologically relevant parameters, the cell cycle time and intracellular green fluorescent protein intensities, to validate the automated results (see Figure 2). The evaluation was based on 110 cells common to both the manual and automated segmentations. Our results suggest that our automated segmentation and tracking method provides quantitative data nearly identical to those obtained manually. Therefore, the biological conclusions regarding cellular proliferation and gene expression would be the same for both techniques.
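For readers who want to reproduce this kind of comparison, the following sketch computes a per-image adjusted Rand index between a manual and an automated label mask using scikit-learn. The helper name is hypothetical, and the authors' exact evaluation pipeline may differ (for example, in how background pixels are handled).

import numpy as np
from sklearn.metrics import adjusted_rand_score

def segmentation_ari(manual_labels, auto_labels):
    """Adjusted Rand index over all pixels of two labeled masks
    (0 = background, 1..N = cell identifiers); 1.0 is a perfect match."""
    return adjusted_rand_score(manual_labels.ravel(), auto_labels.ravel())

# Averaging over a stack of images:
# scores = [segmentation_ari(m, a) for m, a in zip(manual_stack, auto_stack)]
# mean_ari, std_ari = np.mean(scores), np.std(scores)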


Figure 2. Accuracy evaluations of automated segmentation and cell tracking algorithms against manually segmented and tracked cells using three variables of interest to biologists. (A) Histogram of cell cycle times. (B) Fluorescent intensity from a representative cell after division. (C) Average fluorescent intensity over 110 cells versus fraction of cell cycle (percentage of normalized cell cycle time). Dashed line: Manual results. Solid line: Automated results.
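The averaging in Figure 2C requires putting every cell's intensity trace on a common "fraction of cell cycle" axis. A minimal sketch of that normalization is below; the helper name and resampling choice (linear interpolation onto 100 points) are assumptions rather than the authors' analysis code.

import numpy as np

def average_over_normalized_cycle(traces, n_points=100):
    """traces: one 1D intensity array per cell, spanning birth to division.
    Returns (fraction_of_cycle, mean_intensity) averaged across cells."""
    fraction = np.linspace(0.0, 1.0, n_points)
    resampled = []
    for trace in traces:
        # map the cell's own frame indices to 0..1 of its cycle, then resample
        own_axis = np.linspace(0.0, 1.0, len(trace))
        resampled.append(np.interp(fraction, own_axis, trace))
    return fraction, np.mean(resampled, axis=0)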

In summary, we have developed an automated segmentation and tracking system that allows investigators to follow fluctuations in gene expression in single cells and identify correlations between gene expression levels and cell phenotype. With the increasing reliance of systems biology researchers on optical imaging—and the growing volumes of images generated during each experiment—there is a pressing need for collaborative efforts that leverage sophisticated computational methods and quantitative measurements of cells. The future challenges of automation lie in the robustness of the tracking models across multiple cell lines, in extending the models to other cell activities (for example, differentiation), and in performance reliability when encountering densely growing cells in 3D. With these new data describing the dynamic changes in individual cells, we hope to develop predictive models for cell phenotype and gene expression.

This work has been supported by the National Institute of Standards and Technology (NIST). We would like to acknowledge the Cell Systems Science Group, Biochemical Science Division, at NIST for providing the data, and the members of the computational science in biological metrology project at NIST for their invaluable input.


Joe Chalfoun, Michael Halter, Peter Bajcsy, Mary Brady
National Institute of Standards and Technology (NIST)
Gaithersburg, MD

Joe Chalfoun is a research scientist and engineer. He received his PhD in mechanical and robotics engineering and currently works in the medical robotics field, mainly in cell biology: dynamic behavior, microscope automation, real-time tracking, and subcellular feature analysis.

Michael Halter received his PhD in bioengineering from the University of Washington in Seattle in 2004. Since 2006 he has worked at NIST in the areas of quantitative microscopy and quantitative cell biology in the Biochemical Science Division.

Peter Bajcsy received his PhD in electrical and computer engineering from the University of Illinois at Urbana-Champaign in 1997. His area of research is image-based analyses and syntheses using mathematical, statistical, and computational models. He has authored more than 21 papers, 8 book chapters, and about 100 conference papers.

Mary Brady is the manager of the Information Systems Group of the Information Technology Laboratory. The group is focused on developing measurements, standards, and underlying technologies that foster innovation throughout the information life cycle from collection and analysis to sharing and preservation.


References:
1. A. Bahnson, Automated measurement of cell motility and proliferation, BMC Cell Biol. 6, pp. 19, 2005. doi:10.1186/1471-2121-6-19
2. L. Martens, G. Monsieur, C. Ampe, K. Gevaert, J. Vandekerckhove, Cell_motility: a cross-platform, open source application for the study of cell motion paths, BMC Bioinformatics 7, pp. 289, 2006. doi:10.1186/1471-2105-7-289
3. K. Li, E. D. Miller, M. Chen, T. Kanade, L. E. Weiss, P. G. Campbell, Cell population tracking and lineage construction with spatiotemporal context, Med. Image Anal. 12, pp. 546-566, 2008. doi:10.1016/j.media.2008.06.001
4. R. Bise, Z. Yin, T. Kanade, Reliable cell tracking by global data association, IEEE Int'l Symp. Biomed. Imaging, pp. 1004-1010, 2011. doi:10.1109/ISBI.2011.5872571
5. M. Halter, D. R. Sisan, J. Chalfoun, B. J. Stottrup, A. Cardone, A. A. Dima, A. Tona, A. L. Plant, J. T. Elliott, Cell cycle dependent TN-C promoter activity determined by live cell imaging, Cytometry Part A 79A, pp. 192-202, 2011. doi:10.1002/cyto.a.21028
6. L. Hubert, P. Arabie, Comparing partitions, J. Classification 2, pp. 193-218, 1985. doi:10.1007/BF01908075