
Proceedings Paper

Empirical data validation for model building
Author(s): Aram Kazarian

Paper Abstract

Optical Proximity Correction (OPC) has become an integral and critical part of process development for advanced technologies with challenging k1 requirements. OPC solutions in turn require stable, predictive models that can project the behavior of all structures. These structures must comprehend all geometries that can occur in the layout in order to define the optimal corrections by feature, and thus enable a manufacturing process with acceptable margin. The model is built upon two main components. The first is knowledge of the process conditions, which includes the optical parameters (e.g., illumination source, wavelength, lens characteristics) as well as mask definition, resist parameters, and process film-stack information. The second is the empirical critical dimension (CD) data collected using this process on specific test features; these results are used to fit and validate the model and to project resist contours for all allowable feature layouts. The quality of the model is therefore highly dependent on the integrity of the process data collected for this purpose. Since the test pattern suite generally extends below the resolution limit that the process can support with adequate latitude, the CD measurements collected can often be quite noisy, with marginal signal-to-noise ratios. For the model to be reliable and a faithful representation of process behavior, the empirical data must be scrutinized to ensure that it is not dominated by measurement noise or flyer/outlier points. The primary approach for generating a clean, smooth, and dependable empirical data set is replicated measurement sampling, which statistically reduces measurement noise by averaging. However, it can often be impractical to collect the amount of data this method requires.
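The noise-averaging idea behind replicated sampling can be sketched as follows; the CD value, noise level, and replicate count here are illustrative assumptions, not figures from the paper.

```python
import numpy as np

# Illustrative sketch: averaging replicated CD measurements reduces random
# measurement noise; the standard error of the mean shrinks as 1/sqrt(n).
rng = np.random.default_rng(0)

true_cd = 45.0        # nm, assumed true critical dimension (hypothetical)
noise_sigma = 2.0     # nm, assumed per-measurement metrology noise
n_replicates = 16     # number of repeated measurements of the same feature

# Simulate n noisy measurements of one test feature and average them.
measurements = true_cd + rng.normal(0.0, noise_sigma, size=n_replicates)
averaged_cd = measurements.mean()

# Expected noise on the averaged value: sigma / sqrt(n) = 0.5 nm here,
# a 4x improvement over a single measurement.
standard_error = noise_sigma / np.sqrt(n_replicates)
```

The 1/sqrt(n) scaling is also why the abstract calls this approach impractical at scale: halving the residual noise again requires four times as many measurements.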
This paper studies an alternate approach: further smoothing the measured data by curve fitting in order to identify remaining questionable measurement points for engineering scrutiny, since such points risk incorrectly skewing the model. In addition to purely statistical curve fitting, another concept merits investigation: fitting the measured data with first-principles, simulation-based characteristic coherence curves.
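The statistical curve-fitting route can be sketched as a residual screen; the pitch range, trend, injected flyer, cubic-polynomial model, and 3-sigma threshold below are all illustrative assumptions, not the method actually used in the paper.

```python
import numpy as np

# Illustrative sketch: fit a smooth curve through CD-vs-pitch data, then
# flag points whose residuals exceed a 3-sigma band as candidate outliers
# for engineering review (rather than silently discarding them).
pitch = np.linspace(100.0, 400.0, 31)   # nm, hypothetical test-feature pitches
cd = 0.25 * pitch + 5.0                 # assumed smooth underlying CD trend
cd[10] += 15.0                          # inject one flyer measurement

# Purely statistical fit (cubic polynomial chosen arbitrarily here);
# a simulation-based characteristic curve could replace this model.
coeffs = np.polyfit(pitch, cd, deg=3)
residuals = cd - np.polyval(coeffs, pitch)

threshold = 3.0 * np.std(residuals)
flagged = np.flatnonzero(np.abs(residuals) > threshold)  # indices to review
```

Flagged points would then go back to an engineer for re-measurement or exclusion, keeping the model fit from being skewed by metrology flyers.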

Paper Details

Date Published: 24 March 2008
PDF: 6 pages
Proc. SPIE 6922, Metrology, Inspection, and Process Control for Microlithography XXII, 69221I (24 March 2008); doi: 10.1117/12.773414
Author Affiliations:
Aram Kazarian, Synopsys, Inc. (United States)

Published in SPIE Proceedings Vol. 6922:
Metrology, Inspection, and Process Control for Microlithography XXII
John A. Allgair; Christopher J. Raymond, Editor(s)

© SPIE.