Multiphoton microscopy: imaging more deeply

Life scientists can capture the internal structure of samples with high resolution and minimal damage using adaptive-optics techniques borrowed from astronomy.
5 November 2006, SPIE Newsroom. DOI: 10.1117/2.1200610.0437

In the 16 years since multiphoton microscopy was described in Science magazine by Denk, Strickler, and Webb,1 the method has become established throughout the world. This form of microscopy lets a user build up three-dimensional images of fluorescently labeled samples with a resolution, in theory, of the order of a micron. However, as life scientists look deeper into their samples to answer increasingly complex questions, new methods and technologies have to be developed. Most recently these have involved incorporating active optical elements to correct for sample-induced imaging errors, restoring high resolution even at great depths.

Before describing these advances, it is worth reviewing the powerful features of multiphoton microscopy that have enabled its rapid adoption in life-science laboratories around the world, despite its high capital cost. In practical applications of multiphoton microscopy, the high peak power from an ultra-short-pulse laser is used to generate localized excitation within a sample. Generally the laser excites fluorescent molecules that are either intrinsic to the sample or added as part of a specific labeling protocol.

Of chief importance, the excitation source is in the near-infrared portion of the spectrum. Compared with the blue or UV light normally needed to excite most fluorophores, near-infrared light is both less phototoxic to the sample and less strongly scattered. Because the nonlinear excitation process is inherently localized, the imaging method naturally provides three-dimensional optical sectioning of the sample. All of these features allow the life scientist to image more deeply within living tissue while causing minimal perturbation to it, in many cases enabling true in vivo imaging.

However, imaging at depth reveals new problems: the optical properties of the samples themselves start to degrade the image. The major cause is local variation in refractive index produced by cell membranes, local fat deposits, and the like. These variations have recently been measured for a number of samples, demonstrating the complexity of the problem.2 As one penetrates further into the sample, each of these small changes contributes to degradation of the point-spread function that describes the blurring of the focused spot.

In confocal microscopy, this degradation leads to a loss of image quality, but in multiphoton imaging the situation is worse. Because the signal depends on the square of the laser power (for two-photon excitation), the fluorescence signal decreases rapidly. This loss of effective excitation intensity, and hence signal, can be partly compensated by increasing the laser power, but that is limited by the available power and by thermal damage to the sample. In addition to the increased risk to the sample as the spot size grows, the system also loses resolution, which cannot be regained so easily.
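
A simple scaling sketch makes the penalty explicit (it follows directly from the quadratic dependence stated above, rather than being a result from the original work): if aberrations reduce the effective intensity at the focus by a factor \eta, the two-photon signal falls by \eta^2, and restoring it requires raising the laser power by 1/\eta:

F_{2P} \propto I_{\mathrm{focus}}^{2}, \qquad I_{\mathrm{focus}} \to \eta\, I_{\mathrm{focus}} \;\Rightarrow\; F_{2P} \to \eta^{2} F_{2P}, \qquad P \to P/\eta \ \text{to compensate}.

A mere 30% loss of focal intensity (\eta = 0.7) thus already roughly halves the signal.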

Such sample-induced aberrations are well known to ground-based astronomers, and we can apply their technology of active optical elements to overcome the problem. The first use of active-element techniques (adaptive optics) in microscopy employed spatial light modulators, but since these have a throughput of only 1–2% in the near infrared,3 they are not practical for most users. Instead, we have adopted a deformable-mirror approach.4 Crucially, we use a beam-scanning system, since sample scanning is not practical for most biological applications.

The exact correction to be applied with the adaptive-optics element must be determined. One approach, following the astronomers, is to measure the wavefront aberration directly and then apply a correction that cancels the distortion. We have employed an alternative approach, using algorithms5 that optimize a range of merit parameters, including image brightness and contrast.
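
The following Python sketch illustrates one way such a merit-function ("sensorless") optimization can be structured: a stochastic hill-climb over the actuator voltages of a simulated deformable mirror, keeping only perturbations that increase image brightness. The hardware interface and the aberration model here are hypothetical placeholders, and the generic hill-climb shown is an illustration rather than the specific algorithm of reference 5.

import numpy as np

rng = np.random.default_rng(0)

# --- Simulated hardware: placeholders for the real mirror and detector interfaces ---
N_ACTUATORS = 37                                          # a 37-channel membrane mirror is assumed here
TRUE_CORRECTION = 0.3 * rng.standard_normal(N_ACTUATORS)  # hidden, sample-induced aberration

def set_mirror_shape(voltages):
    """Stand-in for sending actuator voltages to the deformable mirror."""
    set_mirror_shape.current = np.asarray(voltages, dtype=float)

def measure_brightness():
    """Stand-in for acquiring an image and summing its pixels.
    Brightness peaks when the mirror shape cancels the aberration."""
    residual = set_mirror_shape.current - TRUE_CORRECTION
    return float(np.exp(-np.sum(residual ** 2)))          # 1.0 = aberration fully cancelled

# --- Model-free optimization of the merit function (here, image brightness) ---
def optimise_mirror(iterations=2000, step=0.05):
    best_v = np.zeros(N_ACTUATORS)
    set_mirror_shape(best_v)
    best_m = measure_brightness()
    for _ in range(iterations):
        trial_v = best_v + step * rng.standard_normal(N_ACTUATORS)
        set_mirror_shape(trial_v)
        m = measure_brightness()
        if m > best_m:                                    # keep only changes that brighten the image
            best_v, best_m = trial_v, m
    set_mirror_shape(best_v)                              # leave the mirror at the best shape found
    return best_v, best_m

if __name__ == "__main__":
    _, merit = optimise_mirror()
    print(f"final merit value: {merit:.3f} (1.0 = aberration fully cancelled)")

In a real system the same loop would simply replace the simulated brightness measurement with an actual image acquisition; more sophisticated search strategies, such as genetic algorithms or modal searches over mirror or Zernike modes, can reduce the number of images that must be recorded.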

The general configuration is shown in Figure 1. In this system the deformable membrane mirror (Oko Technologies) is placed at a reimaged point of the scanning system, so the beam is stationary on the mirror. To separate the light before and after the membrane mirror, a quarter-wave plate is used in combination with a polarizing beam splitter. The resulting light is then directed toward the objective lens. For most in-depth imaging applications this is either a water-dipping lens (NA ∼1.0, ×40, Nikon, UK) or a long-working-distance air objective (NA = 0.75, ×20, Nikon, UK).

Figure 1. The schematic illustrates multiphoton imaging with an integrated deformable membrane mirror.

In principle, since the mirror can be updated at around 10 kHz, it would be possible to adjust the mirror shape for each pixel of the scan. But for a 512 × 512 image this would require a minimum of 26 s to record one optical section. We have therefore evaluated the use of a single correction shape for an entire optical section. The effect on the axial resolution at different depths can be seen in Figure 2.
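
The arithmetic behind that figure is straightforward: a 512 × 512 scan contains 262,144 pixels, so at a mirror update rate of roughly 10 kHz a per-pixel correction would need

\frac{512 \times 512\ \text{pixels}}{10^{4}\ \text{updates/s}} \approx 26\ \text{s}

per optical section, far too slow for live imaging.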

Figure 2. Applying a single correction (red) across an entire plane of rat brain reverses the degradation (blue) of the axial point-spread function (PSF) with depth almost as well as correcting each individual pixel (green).

Although the adaptive-optics correction may not return the system to its ultimate performance level, it does provide a practical solution for the user. Indeed, we have shown, for a number of rat-brain sections, that using a single correction shape for a given depth retains around 80% of the ultimate performance of the system. Figure 3 illustrates the improvement in the image (in this case, optimized for brightness) 350 μm into a rat-brain sample.

Figure 3. The left-hand panel shows an unoptimized image, whereas the right-hand panel has been optimized for the brightness of a single pixel; that correction was then applied throughout the sample. Specific points are circled to highlight the improvements.

In summary, it is now possible to achieve almost diffraction-limited performance with multiphoton imaging around 500 μm into intact, living tissue samples. Using a single-point correction across an entire image plane enables imaging at the maximum speed of the optical scan head, provided the image is bright enough. It is thus possible to record video-rate images inside living samples with micron resolution. Multiphoton imaging has advanced significantly in the last 16 years, and through the adoption of technologies originally developed for other branches of imaging, further improvements are still being made.

The author would like to acknowledge the support of the Engineering and Physical Sciences Research Council (EPSRC) and funding from the Research Councils UK Basic Technology Programme.

John Girkin
Institute of Photonics, University of Strathclyde
Glasgow, UK

John Girkin is an associate director at the Institute of Photonics, University of Strathclyde, responsible for applications of photonics technologies, with a specific emphasis on microscopy for the life sciences.