
Journal of Biomedical Optics • Open Access

Brightness-compensated 3-D optical flow algorithm for monitoring cochlear motion patterns
Author(s): Miriam von Tiedemann; Anders Fridberger; Mats Ulfendahl; Jacques H. R. Boutet de Monvel

Paper Abstract

A method for three-dimensional motion analysis designed for live cell imaging by fluorescence confocal microscopy is described. The approach is based on optical flow computation and takes into account brightness variations in the image scene that are not due to motion, such as photobleaching or fluorescence variations that may reflect changes in cellular physiology. The 3-D optical flow algorithm allowed almost perfect motion estimation on noise-free artificial sequences, and performed with a relative error of <10% on noisy images typical of real experiments. The method was applied to a series of 3-D confocal image stacks from an in vitro preparation of the guinea pig cochlea. The complex motions caused by slow pressure changes in the cochlear compartments were quantified. At the surface of the hearing organ, the largest motion component was the transverse one (normal to the surface), but significant radial and longitudinal displacements were also present. The outer hair cells displayed larger radial motion at their basolateral membranes than at their apical surfaces. These movements reflect mechanical interactions between different cellular structures, which may be important for communicating sound-evoked vibrations to the sensory cells. A better understanding of these interactions is important for testing realistic models of cochlear mechanics.
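The kind of brightness compensation the abstract describes can be illustrated with a generalized optical-flow constancy constraint, in which the intensity of a tracked point is allowed to change at some rate m not caused by motion (e.g., photobleaching): Ix·u + Iy·v + Iz·w + It = m·I. The sketch below solves this constraint by local linear least squares in a small window, Lucas-Kanade style. It is an illustrative toy, not the authors' algorithm; the function name, the 5×5×5 window, and the purely multiplicative brightness model are all assumptions.

```python
import numpy as np

def local_flow_with_brightness(I0, I1, voxel):
    """Estimate a 3-D flow vector (u, v, w) plus a multiplicative
    brightness-change rate m around `voxel`, from two volumes I0, I1.

    Solves, in the least-squares sense over a small window, the
    generalized constancy constraint  Ix*u + Iy*v + Iz*w + It = m*I,
    which reduces to ordinary brightness constancy when m = 0.
    Illustrative sketch only, not the paper's algorithm.
    """
    I0 = I0.astype(float)
    I1 = I1.astype(float)
    # Spatial gradients of the first volume (axes ordered z, y, x),
    # and the temporal difference between the two volumes.
    Iz, Iy, Ix = np.gradient(I0)
    It = I1 - I0

    z, y, x = voxel
    r = 2  # window half-size: a 5x5x5 neighborhood (assumed)
    sl = (slice(z - r, z + r + 1),
          slice(y - r, y + r + 1),
          slice(x - r, x + r + 1))

    # One linear equation per voxel in the window:
    #   Ix*u + Iy*v + Iz*w - I0*m = -It
    A = np.stack([Ix[sl].ravel(), Iy[sl].ravel(),
                  Iz[sl].ravel(), -I0[sl].ravel()], axis=1)
    b = -It[sl].ravel()
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    return params  # (u, v, w, m)
```

On a synthetic Gaussian blob translated by a sub-voxel amount and uniformly brightened, the extra unknown m absorbs the intensity change, so the motion estimate is not biased by it, which is the point of the compensation.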

Paper Details

Date Published: 1 September 2010
PDF: 8 pages
J. Biomed. Opt. 15(5), 056012. doi: 10.1117/1.3494564
Published in: Journal of Biomedical Optics Volume 15, Issue 5
Author Affiliations:
Miriam von Tiedemann, Karolinska Institutet (Sweden)
Anders Fridberger, Karolinska Institutet (Sweden)
Mats Ulfendahl, Karolinska Institutet (Sweden)
Jacques H. R. Boutet de Monvel, Institut Pasteur (France)

© SPIE.