
Micro/Nano Lithography

For semiconductor manufacture, pattern alignment requires subnanometer uncertainty

Future process specifications will demand improved understanding of overlay measurement uncertainty and better metrology.
26 February 2008, SPIE Newsroom. DOI: 10.1117/2.1200702.1036

Overlay error is the misalignment between any of the 20 or more patterns printed at different stages of semiconductor integrated circuit manufacture (see Figure 1). For the devices to work properly, overlay should be no more than about 20% of the smallest dimension of any element in the patterns. Overlay data, used to detect alignment problems and to calculate adjustments to the process tools, require a total measurement uncertainty (TMU) much smaller than the process tolerance. Until now, optical microscopy has been the method of choice for overlay metrology, but a new approach may be needed for the 22nm processes currently in development, which will require a TMU of less than 0.4nm.
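As a concrete check of these budgets, the 20% rule and the quoted 22nm-node figures can be combined in a few lines. This is a sketch: the roughly 10:1 tolerance-to-TMU ratio it illustrates is inferred from the numbers in the text, not stated there explicitly.

```python
def overlay_tolerance(min_feature_nm, fraction=0.2):
    """Rule of thumb from the text: overlay must stay within
    about 20% of the smallest pattern dimension."""
    return fraction * min_feature_nm

# For a 22 nm process the overlay tolerance is about 4.4 nm; the
# quoted 0.4 nm TMU target is then roughly a tenth of that tolerance.
tol_22nm = overlay_tolerance(22.0)
print(tol_22nm)  # 4.4
```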

With no reference method for overlay, direct determination of TMU on product wafers is not possible. As a result, overlay metrology is often limited to assessments of measurement consistency, such as precision and tool-induced shift (TIS). TIS is half the measurement change when the sample is rotated by 180°, after allowing for the expected change of sign. Other contributors to TMU include differences in measurements between tools, discrepancies in results from identical adjacent targets,1 and variance between measured data and the best fit to patterning tool models. These effects suggest that the real uncertainty in overlay at any point in the device is greater than 5nm, well above our 0.4nm TMU target. Better understanding is needed to make the right decisions about current and potential future techniques for overlay measurement.
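The TIS definition above translates directly into code. In this sketch (function and variable names are illustrative), the overlay readings at the two wafer orientations are combined; the true overlay changes sign on a 180° rotation while any tool asymmetry does not, so their sum isolates the tool contribution:

```python
def tool_induced_shift(m_0, m_180):
    """TIS: half the change between the 0-degree and 180-degree
    measurements, after allowing for the expected sign flip of
    the true overlay on rotation."""
    return 0.5 * (m_0 + m_180)

def tis_corrected_overlay(m_0, m_180):
    """Overlay estimate with the tool asymmetry averaged out."""
    return 0.5 * (m_0 - m_180)

# Example: a true overlay of 3.0 nm seen with a 0.4 nm tool bias
# gives TIS of about 0.4 nm and a corrected overlay of about 3.0 nm.
m_0 = 3.0 + 0.4     # reading at 0 degrees
m_180 = -3.0 + 0.4  # reading at 180 degrees: overlay sign flips, bias does not
```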

Figure 1. Sample overlay cross-section for a poly-gate stack. SiO2: Silicon dioxide.

Figure 2. A Blossom target. There are 112 crosses, 28 in each quadrant. The target shown has crosses in four different layers, with each layer having 7 crosses (redundancy=7) per quadrant and represented by a different color.

Target design plays a fundamental role in measurement precision and accuracy, but targets are also subject to size restrictions. Today, overlay is measured using relatively large (25×25μm) bar-in-bar or box-in-box (either abbreviated BIB) targets. New designs such as Blossom2 (see Figure 2) provide more information in the same space, improving measurement uncertainty and making the target less susceptible to damage by destructive process steps. The number of elements printed at each layer, or measurement redundancy, can be adjusted to trade measurement precision against space saving (see Figure 3), or to permit direct measurement of double-patterned process steps.
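The precision-versus-redundancy trade-off can be sketched with the idealized 1/√n scaling for averaging independent measurements. This is an assumption for illustration; real targets deviate from it, as the measured curve in Figure 3 reflects.

```python
import math

def averaged_precision(single_cross_sigma, redundancy):
    """Precision of the mean over `redundancy` nominally identical
    crosses, assuming independent, identically distributed errors."""
    return single_cross_sigma / math.sqrt(redundancy)

# Quadrupling the number of crosses per layer halves the random
# component of the measurement uncertainty, at the cost of target area.
```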

Variations in wafer surface reflectivity limit the accuracy with which overlay can be measured. With a BIB target, a 0.5% change in reflectivity contributes 0.5nm to TMU; the greater amount of information in a Blossom target reduces this contribution to 0.1nm.3 Scatterometry can also measure overlay, and results published to date show promising levels of precision and TIS,4 though these come from targets requiring ten times the space used by Blossom.

Figure 3. Blossom measurement precision vs. redundancy. The value at redundancy=0 is the precision for a traditional BIB target.

Figure 4. Measured in-die overlay (black arrows) and model-based overlay predictions (red arrows).

Models allow the use of extremely sparse sampling plans, but the difference between the data and the fitted model can exceed the overlay process control budget. Some of these differences arise from reticle fabrication error.5 The remainder arise from real effects outside the scope of the models. A new image-based technique uses targets small enough (3×3μm or less) to be placed within the die area of many products.5 Figure 4 shows the measured in-die overlay for a production poly-gate wafer together with modeled overlay derived from measurements at the four BIB targets in the scribe lines. The difference between the two data sets is 20nm (3σ), much larger than the required process control budget. Characterizing in-die overlay using actual measurements can improve control where it is needed: in the device itself.
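The gap between sparse model-based predictions and dense in-die data can be quantified by fitting a linear patterning model and examining the residuals. The sketch below uses a common translation/scale/rotation model solved by ordinary least squares; the coefficients and target positions are illustrative, not the models of reference 5.

```python
import numpy as np

def fit_linear_overlay(x, y, dx):
    """Fit dx = T + S*x - R*y (translation, wafer scale, rotation)
    by least squares; return the parameters and the residuals."""
    A = np.column_stack([np.ones_like(x), x, -y])
    params, *_ = np.linalg.lstsq(A, dx, rcond=None)
    return params, dx - A @ params

# Four hypothetical scribe-line targets at the field corners (mm).
x = np.array([-100.0, 100.0, 100.0, -100.0])
y = np.array([-100.0, -100.0, 100.0, 100.0])
dx = 1.0 + 0.01 * x - 0.005 * y   # overlay in nm, exactly linear here

params, resid = fit_linear_overlay(x, y, dx)
three_sigma = 3.0 * resid.std()   # in-die effects outside the model's
                                  # scope would show up as a large value
```

With dense in-die data substituted for `dx`, a `three_sigma` well above the process control budget is exactly the model-versus-measurement discrepancy the text describes.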

Maintaining semiconductor overlay control below 4nm in the future will require a modified approach. Reticle qualification and in-chip measurements can be used to characterize overlay throughout the printed field. Process control measurements can use high-redundancy Blossom or scatterometry targets for the most critical layers, and multilayer Blossom targets for dual patterned layers and to save space in steps that are less critical. No insurmountable roadblocks have been discovered.

Nigel Smith
Hsinchu, Taiwan
Nigel Smith is the senior engineering director responsible for overlay technology development at Nanometrics.
Lewis Binns
York, UK
Lewis Binns holds a BSc in physics with astrophysics from the University of Birmingham, UK, and an MSc in advanced scientific computation from the University of Liverpool.
Bert Plambeck, Kevin Heidrich
Milpitas, CA
Bert Plambeck is director of marketing for overlay products at Nanometrics and previously worked for several other companies within the semiconductor industry.
Kevin Heidrich is director of new product development at Nanometrics.