
Lasers & Sources

Resolving resolution

Achieving accurate results for your spectroscopy application requires a clear understanding of instrument resolution.

From oemagazine July 2003
30 July 2003, SPIE Newsroom. DOI: 10.1117/2.5200307.0008

When you're choosing a grating-based spectrometer, whether one slit or two, one of the first issues you should consider is the spectral resolution of the instrument. The spectral resolution is generally defined as the spectral separation between the two closest peaks that the instrument can resolve. Determining the working resolution of a spectrometer system is not as simple as reading the resolution value off the specification sheet, however.

For a digital detector to resolve two separate peaks, the pixels carrying the peak signals must be separated by at least one pixel with a lower signal.

For a spectrometer, the bandpass (BP) specifies how much spectral bandwidth is passed at a given wavelength setting. Because this limits the ability of the spectrometer to separate peaks, the BP is commonly referred to as the spectral resolution. Calculating the BP requires the output image width (Wo) and the reciprocal linear dispersion (RLD). The RLD indicates how much of the spectrum is spread over 1 mm at the focal plane; it is given in nm/mm, is listed as a primary instrument specification, and varies with wavelength. The output image width is simply the input slit width Wi multiplied by the magnification M of the spectrometer. In the common case in which M = 1 and Wo = Wi, the bandpass of a two-slit spectrometer is, to first order,

BP (nm) = RLD (nm/mm) x Wo (mm)

             = RLD (nm/mm) x Wi (mm) x M        [1]

This means that if the slit width is set at 0.1 mm and the monochromator has a linear dispersion of 2.6 nm/mm, the bandpass or spectral resolution will be 0.26 nm. In other words, if two spectral features are 0.26 nm or more apart, we can positively identify them as being separate. We can improve spectral resolution by narrowing the slit, but eventually (at 5 to 25 µm or so), aberrations begin to degrade performance. At this point, narrowing the slit further no longer improves resolution but does cut down the amount of light.
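The slit-limited bandpass calculation can be sketched as a short helper; the function name and defaults here are illustrative, not from the original article:

```python
# Bandpass (spectral resolution) of a two-slit spectrometer, per Eq. 1:
# BP = RLD x Wo, with output width Wo = Wi x M.
def bandpass_nm(rld_nm_per_mm, slit_width_mm, magnification=1.0):
    """Return the first-order bandpass in nm."""
    return rld_nm_per_mm * slit_width_mm * magnification

# Worked example from the text: 0.1-mm slit, 2.6-nm/mm dispersion, M = 1.
bp = bandpass_nm(2.6, 0.1)
print(f"Bandpass: {bp:.2f} nm")  # 0.26 nm
```

Remember the caveat above: this linear scaling holds only down to slit widths of roughly 5 to 25 µm, below which aberrations dominate.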

The bandpass for the case in which the slits are as narrow as practical is known as the resolution specification of the instrument, stipulated for a specific wavelength and grating density. Most spectrometers have the resolution specified when the spectrometer is in a scanning-monochromator and single-element-detector configuration. A typical 0.32-m focal length monochromator might have a specified resolution of 0.06 nm with a 1200-g/mm grating and 0.010-mm slits. Increasing the slit width, for example to increase signal-to-noise ratio, will reduce the ability of the spectrometer to resolve peaks, and the spectral resolution will change according to equation 1.

Changing the grating groove density will change the linear dispersion, and hence the resolution, by a ratio of the base groove density to the desired groove density. For example, if the specifications for an instrument assume a 1200-g/mm base groove density and we want to fit it with a 600-g/mm grating instead, we obtain the linear dispersion and resolution for the new configuration by multiplying the specified quantities by a factor of two.
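That scaling rule, a sketch with illustrative names, takes the same form for both dispersion and resolution:

```python
# Scale a dispersion or resolution specification when swapping gratings.
# Both quantities scale by (base groove density / new groove density).
def scale_for_grating(spec_value, base_g_per_mm, new_g_per_mm):
    return spec_value * base_g_per_mm / new_g_per_mm

# Example from the text: 1200-g/mm base grating replaced by 600 g/mm
# doubles both the linear dispersion and the resolution value.
rld_600 = scale_for_grating(2.6, 1200, 600)   # 5.2 nm/mm
res_600 = scale_for_grating(0.06, 1200, 600)  # 0.12 nm
```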

Determining the spectral resolution of a spectrograph and array detector system involves accounting for the interaction of the spectrograph and detector. In spectroscopy, array detectors typically feature pixel widths Wp between 13 µm and 27 µm. Under the specifications for the spectrometer above, the signal reaching the detector could fall onto one pixel. For the detector to resolve two peaks, one pixel between the two peaks must receive a lower signal than its neighbors (see figure). If the peaks fall on the detector so that the pixels with the maximum signal are next to each other, the two peaks will not be resolved by the spectrometer and CCD combination.

To ensure that the spectrometer and CCD system can resolve any spectral features of interest, the best spectral resolution of the system, R, can be calculated by

R = RLD (nm/mm) x Wp (mm/pixel) x 3 pixels         [2]

For a CCD detector with a pixel width of 13 µm and a 0.32-m spectrograph with a 1200-g/mm grating and 0.010-mm-wide input slit, the spectral resolution would be 0.10 nm.
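The pixel-limited calculation of equation 2 can be sketched as follows; the 2.6-nm/mm dispersion used in the example is an assumed value for the 0.32-m, 1200-g/mm configuration described above:

```python
# Pixel-limited resolution of a spectrograph + array detector, per Eq. 2.
# Three pixels are the minimum to separate two peaks: peak, valley, peak.
def pixel_limited_resolution_nm(rld_nm_per_mm, pixel_width_um):
    pixel_width_mm = pixel_width_um / 1000.0  # convert um to mm
    return rld_nm_per_mm * pixel_width_mm * 3

# Example from the text: 13-um pixels, ~2.6-nm/mm dispersion (assumed).
r = pixel_limited_resolution_nm(2.6, 13)
print(f"System resolution: {r:.2f} nm")  # 0.10 nm
```

In practice the working resolution of the combined system is limited by whichever is larger: the slit-limited bandpass of equation 1 or the pixel-limited value of equation 2.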

Note that the ultimate resolution of these instruments, spectrometer or spectrograph, relies on illuminating the grating appropriately. When evaluating a spectrometer for your resolution requirements, it is important to consider that your application may require slit widths larger than those used to define the resolution specification. oe

Ray Pini
Ray Pini is senior applications scientist, Optical Spectroscopy Division, at Jobin Yvon Inc., Edison, NJ.