Proceedings Volume 1673

Integrated Circuit Metrology, Inspection, and Process Control VI


View the digital version of this volume at the SPIE Digital Library.

Volume Details

Date Published: 1 June 1992
Contents: 9 Sessions, 66 Papers, 0 Presentations
Conference: Microlithography '92, 1992
Volume Number: 1673

Table of Contents

All links to SPIE Proceedings will open in the SPIE Digital Library.
  • Linewidth Metrology I: Applications
  • Linewidth Metrology II: Modeling/Simulations
  • Overlay Metrology
  • Phase-Shift Mask Metrology
  • Lithographic Process Monitoring/Metrology I
  • Lithographic Process Monitoring/Metrology II
  • Particles and Defects
  • Special Topics and Emerging Technologies
  • Integrated Circuit Manufacturing System and Technology
Linewidth Metrology I: Applications
Cost analysis and risk assessment for metrology applications
Sudhakar M. Kudva, Randall Potter
Historically, the significance of an accurate and precise metrology tool has been determined by rules of thumb such as the Gagemaker's Rule. However, with the advent of statistical process control of IC manufacturing, it has become practical to statistically determine the probability of a product being misclassified during metrology. Several parameters, such as the process distribution, the precision and accuracy of the metrology tool, the measurement strategies, etc., determine the probability that a good product is classified as bad and vice versa. The probability function can subsequently be converted to a number equivalent to the percentage of product misclassified. From this number, the cost of misclassification can be calculated, which is a function of the precision, the accuracy, and the measurement strategy used. This cost can be used in making decisions involving justification of new metrology capability, better measurement strategies, or to decide whether metrology is needed at all. Examples have been generated to illustrate the actual cost involved in using a poor metrology tool, and strategies have been suggested to contain the cost of misclassification.
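As a rough illustration of the misclassification calculation the abstract describes, the sketch below estimates false-reject and false-accept rates by Monte Carlo, assuming (purely for illustration) a Gaussian process distribution, a Gaussian gauge error, and hypothetical specification limits; none of these numbers come from the paper.

    import numpy as np

    # Hypothetical inputs: process distribution, gauge (metrology) error, spec limits.
    mu, sigma_p = 0.50, 0.02       # true CD distribution, micrometers
    sigma_m = 0.01                 # metrology precision (1 sigma), micrometers
    lsl, usl = 0.45, 0.55          # lower/upper specification limits, micrometers

    rng = np.random.default_rng(0)
    true_cd = rng.normal(mu, sigma_p, 1_000_000)
    measured = true_cd + rng.normal(0.0, sigma_m, true_cd.size)

    good = (true_cd >= lsl) & (true_cd <= usl)
    passed = (measured >= lsl) & (measured <= usl)

    false_reject = np.mean(good & ~passed)    # good product classified as bad
    false_accept = np.mean(~good & passed)    # bad product classified as good
    print(f"false reject: {false_reject:.4%}, false accept: {false_accept:.4%}")

Multiplying these rates by the cost of scrapping good product and of shipping bad product gives the kind of misclassification cost figure the abstract refers to.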
Performance of a submicron litho process under CLSM metrology with linearity and discrimination measurements
Mircea V. Dusa, Tony DiBiase, Harris J. Keston
An attempt is made to characterize the submicron litho process under confocal laser scanning microscope metrology, using `linearity' and `discrimination' as process control parameters. This was found to allow for a more detailed process characterization than the usual `CD value' information alone. This also allows one to determine how the CLSM system performs on a particular type of sample and over a specified linewidth range. The measurement substrates included silicon wafers coated with 1.1 micrometers of resist and printed with a gradient of isolated lines ranging from 0.6 micrometers to 1.0 micrometers in 0.05 micrometers increments. The CLSM measurement system was `tuned' to yield optimum precision and accuracy so that measurement linearity would be well established over the range of interest. There are two types of linearities considered here as `linear dependencies:' the measured CD values versus their nominal values and the measured CD sensitivity versus exposure changes for a given focus setting. These linearities are quantified by (1) coefficients of regression (slope), (2) coefficients of determination (R-SQ), and (3) intercept values. Changes in slope values indicate process operating conditions which range from normal to extreme (under/over exposure and positive/negative defocus). R-SQ indicates metrology robustness and the intercept represents the measurement offset over a wide range of operating conditions. There are also two types of capabilities considered here: the CLSM's ability to distinguish (differentiate) 50 nm feature biases over a range of linewidths and the system's ability to differentiate structural changes as a result of focus and exposure changes. Results indicate that a tuned CLSM system can be used to monitor effects of variations in operating conditions as small as 5 mJ/cm2 in exposure energy and 0.4 micrometers defocus.
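The slope, R-SQ, and intercept figures of merit described above are the outputs of an ordinary least-squares fit of measured CD against nominal CD; a minimal sketch with made-up numbers (not the paper's data):

    import numpy as np

    # Hypothetical measured vs. nominal CDs over the 0.6-1.0 micrometer range.
    nominal  = np.array([0.60, 0.65, 0.70, 0.75, 0.80, 0.85, 0.90, 0.95, 1.00])
    measured = np.array([0.63, 0.66, 0.71, 0.77, 0.81, 0.86, 0.92, 0.96, 1.01])

    slope, intercept = np.polyfit(nominal, measured, 1)
    predicted = slope * nominal + intercept
    r_sq = 1.0 - np.sum((measured - predicted) ** 2) / np.sum((measured - measured.mean()) ** 2)

    # Ideal behavior: slope ~ 1 (no scale error), intercept ~ 0 (no offset), R-SQ ~ 1.
    print(f"slope = {slope:.3f}, intercept = {intercept * 1000:.1f} nm, R-SQ = {r_sq:.4f}")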
Submicrometer dimensional measurements with optical microscopy
Stanley S. C. Chim, Gordon S. Kino
We describe here a new approach for measuring focus/exposure dense photoresist line structures with a 1 micrometers spatial period. The algorithm depends on a data clustering technique which allows us to measure resist lines down to 0.3 micrometers. With an advanced calibration procedure, linearity between optical and SEM measurements is achieved down to 0.3 micrometers for nested focus/exposure resist structures with a standard deviation of about 10 nm.
Application of statistical models to decomposition of systematic and random error in low-voltage SEM metrology
Kevin M. Monahan, Sadri Khalessi
Site-to-site LVSEM measurement data on insulating samples are affected in a systematic way by the number of measurements per site. The problem stems from the fact that repeated imaging at the same site does not produce true statistical replicates since the electron dose is cumulative. Indeed, the measurement values tend to grow or shrink in direct proportion to the total dose applied. The data support a model for linewidth as a function of electron dose that includes a linear term for systematic error and a reciprocal square root term as a scaling parameter for random error. We show that charging samples such as resist on oxide, where measurements are dominated by site-to-site variation in the systematic error, should be measured at low electron dose. Conversely, conducting samples such as polysilicon on oxide, where the measurements are dominated by random error, should be measured at relatively high electron dose.
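One plausible way to express the dose model described above is a weighted least-squares fit in which the mean linewidth carries the linear systematic term and the per-point weight carries the 1/sqrt(dose) scaling of the random term; the sketch below uses hypothetical data and is not the authors' fitting code.

    import numpy as np
    from scipy.optimize import curve_fit

    # Hypothetical site data: electron dose (arbitrary units) and measured linewidth (nm).
    dose = np.array([5.0, 10.0, 20.0, 40.0, 80.0, 160.0])
    cd   = np.array([351.0, 352.2, 354.1, 358.0, 365.9, 382.3])

    def systematic(D, cd0, a):
        # Linear growth (or shrinkage) of the measurement with cumulative dose.
        return cd0 + a * D

    # Random error assumed to scale as 1/sqrt(dose): fewer electrons, noisier estimate.
    sigma = 1.0 / np.sqrt(dose)
    (cd0, a), _ = curve_fit(systematic, dose, cd, sigma=sigma)
    print(f"zero-dose linewidth ~ {cd0:.1f} nm, systematic slope ~ {a:.3f} nm per dose unit")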
Scanning electron microscope system for linewidth measurement of IC
Cheon Il Eom, TaeBong Eom, Yeong-Uk Ko, et al.
Two methods of measuring the linewidth of IC using a scanning electron microscope (SEM) have been studied. In the first method, the electron beam was digitally scanned by D/A converters and the signal intensity of secondary electrons obtained by an A/D converter was analyzed by image processing technique to determine the linewidth. This method was found to be very simple and fast, but it was necessary to have a standard specimen to calibrate the magnification of the SEM. Moreover, the distortion of the electron optics induced additional errors in the linewidth measurement. In the second method, the electron beam was fixed and the specimen was set on a precise scanning stage driven by a piezoelectric transducer. The linewidth of the specimen has been determined from the signal intensity of the secondary electron and the displacement of the stage measured by a laser interferometer. This method was used to calibrate the linewidth of the standard specimen. For this study, a system which can be used to measure the linewidth by either of the two methods has been developed. The Monte Carlo simulation was also carried out to obtain the intensities of secondary and backscattered electrons. The results of the measurements and the simulation are discussed.
Linewidth Metrology II: Modeling/Simulations
Numerical reference models for optical metrology simulation
Gregory L. Wojcik, John Mould Jr., Egon Marx, et al.
Optical modeling on the computer can aid R&D efforts to enhance metrology methods, and similarly for lithography, alignment, and particulate monitoring. However, full exploitation of optical modeling is hindered by the lack of appropriate benchmarks for verifying algorithms and evaluating approximations. To help remedy this situation we describe a preliminary set of scalar, 2-D numerical reference models (NRMs). These include isolated thin and thick lines, periodic lines, and an isolated trench. Scattered fields are compared for three different solution methods, based on time-domain finite elements, boundary integrals, and a waveguide model. Correlation is good in general, although important differences are seen in both code accuracy and performance. NRM generalizations are suggested that accommodate 3-D effects, imaging, and experimental verification.
Pattern recognition algorithms for linewidth measurement
Hamid K. Aghajan, Charles D. Schaper, Thomas Kailath
Novel edge detection and line fitting pattern recognition algorithms are applied for linewidth measurement on images of integrated circuits. The strategy employs a two-step procedure. In the first step, a neural network is used for edge detection of the image. Three neural network approaches are investigated: bootstrap linear threshold, self-organizing, and constrained maximization strategies. These neural networks combine filtering and thresholding to reduce noise and aberrations in the image. Further, the parameters of the neural network are estimated using an unsupervised learning procedure. The advantage of this learning strategy is the ability to adapt to the imaging environment. Consequently, these proposed neural network approaches for edge detection do not require an a priori database of images with known linewidths for calibration. In the second step, new line-fitting methods are applied to the edge maps defined by the neural network to compute linewidth. Two methods are investigated: an eigenvector strategy and a technique that is based on a reformulation of the line-fitting problem such that advanced signal processing techniques can be employed. The latter algorithm is capable of fitting multiple lines in an image which need not be parallel, is computationally faster than conventional techniques, and can be implemented on-line. By employing this two-step strategy, the entire image is used to estimate linewidth as opposed to a single or few line scans. Thus, edge roughness effects can be spatially averaged to obtain an optimal estimate of linewidth. The techniques are general and can be used on images from a variety of microscopes including optical and electron-beam. The pattern recognition algorithms are applied to images of patterned wafers with lines smaller than 1 micrometers wide. These images are obtained by optical microscopes. The estimated linewidths are shown to be in close agreement with those measured by scanning electron microscopes. The application of the proposed pattern recognition techniques to solve other problems in IC metrology, such as rotational wafer alignment, is also discussed.
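The eigenvector line-fitting strategy mentioned above is commonly implemented as a total-least-squares fit, in which the line direction is the principal eigenvector of the scatter matrix of the edge points; a minimal sketch on hypothetical edge-map coordinates (not the authors' implementation):

    import numpy as np

    def fit_line_eigen(points):
        """Total-least-squares line fit: returns a point on the line and a unit direction."""
        centroid = points.mean(axis=0)
        centered = points - centroid
        # The eigenvector of the 2x2 scatter matrix with the largest eigenvalue is the line direction.
        _, vecs = np.linalg.eigh(centered.T @ centered)
        return centroid, vecs[:, -1]

    # Hypothetical edge pixels (x, y) from the two edges of one resist line.
    left_edge  = np.array([[10.2, 0.0], [10.1, 1.0], [10.3, 2.0], [10.0, 3.0], [10.2, 4.0]])
    right_edge = np.array([[22.1, 0.0], [22.3, 1.0], [22.0, 2.0], [22.2, 3.0], [22.1, 4.0]])

    c_left, d_left = fit_line_eigen(left_edge)
    c_right, _ = fit_line_eigen(right_edge)

    # Linewidth (in pixels) = separation of the two nearly parallel fitted lines along the normal.
    normal = np.array([-d_left[1], d_left[0]])
    print("linewidth (pixels):", abs(np.dot(c_right - c_left, normal)))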
Progress in optical imaging theory for trenches and lines
Ching-Hua Chou, Gordon S. Kino
We have been developing imaging theory for the confocal microscope and for interferometric microscopes. We are particularly interested in developing techniques for improving the accuracy of measurements of the width of trenches and lines in arrays of photoresist lines and trenches. We are also interested in using the theory to help us interpret phase measurements of phase shift masks with our interferometric microscope, the correlation microscope, as described in an accompanying paper. The aim is to get down to the smallest critical dimensions possible, in the 0.3 micrometers range, and to eliminate the discontinuities sometimes seen in measurements of trenches and lines; these discontinuities are due to resonances of the optical waves in the structure. Since a large number of well-characterized samples is not usually obtainable, this theory is extremely useful for understanding how the various features of a structure affect the form of cloud plots and linescans. A basic aim is to use the theoretical calculations to train pattern recognition algorithms and to determine how the shape and size of trenches and lines influence the form of the linescans and cloud plots we observe. It is particularly important to understand whether pattern recognition algorithms, which make use of the whole cloud plot rather than thresholding on a single linescan, are only using the information from the top of a trench or line. We need to determine whether linescans at all levels are, in fact, influenced by the size at the top and bottom of a trench or line. In this paper, we show that this is indeed the case, and that the theory is adequate at the present time to give a fairly good representation of experimental results.
Model for electron-beam metrology algorithm
Dorron D. Levy, Larry Hendler
High accuracy, versatile metrology is essential for achieving quality and yield in the microelectronic industry. In order to measure line width accurately, a model must be created that will equate the output waveform to the surface feature measured. This model requires an understanding of the interaction of the e-beam with the sample, and the cause and effect relationship between surface topography and waveform. In this paper we propose a simplified mechanism based on the surface response of a sample with a finite e-beam. The waveform produced is the result of the convolution of the e-beam with this responsive surface. We show how surface topography affects waveform structure, and then show how knowledge of the structure translates into accurate line width measurements. We introduce a class of secondary electrons called SE4 electrons, whose source is the interaction of BSE from the surface with other portions of the line being measured. Finally, we discuss how charging phenomena and detector efficiency can be integrated into the model to make it more compatible with actual working conditions.
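The central relation in this picture, the detected waveform as a convolution of the finite probe with a responsive surface, can be illustrated numerically; the Gaussian probe width and the edge-enhanced surface response below are invented for the sketch and are not taken from the paper.

    import numpy as np

    x = np.arange(-200.0, 200.0, 1.0)        # position across the line, nm

    # Hypothetical surface response: modest yield on flat areas, strong peaks at the two edges.
    response = np.full_like(x, 0.2)
    response[(x > -50) & (x < 50)] = 0.3     # top of the line
    for edge in (-50.0, 50.0):
        response[np.abs(x - edge) < 5.0] = 1.0

    # Hypothetical finite e-beam: normalized Gaussian probe, sigma = 10 nm.
    beam = np.exp(-0.5 * (x / 10.0) ** 2)
    beam /= beam.sum()

    # Measured waveform = convolution of the probe with the responsive surface;
    # the sharp edge peaks in `response` come out broadened and reduced in height.
    waveform = np.convolve(response, beam, mode="same")
    print("peak surface response:", response.max(), "-> peak waveform:", round(waveform.max(), 3))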
Simulation of linearity in optical microscopes
Different optical metrology systems such as broadband confocal microscopes, broadband coherence probe microscopes, and broadband brightfield microscopes show different linearity characteristics for different layers and linewidths. Linearity of response is dependent not only on the layer specifics but on the optical system parameters as well. These include the type of microscope, the bandwidth of illumination, the numerical aperture, the partial coherence, etc. Algorithm parameters such as focus offset, threshold, and phase filter strength and placement (coherence probe) also make a difference. Since computer simulation is now able to predict (with good accuracy) the linewidths measured by these technologies and parameters, it is natural to begin a systematic study of the theoretical predictions. For example, the effects of wavelength on linearity for four types of microscopes are shown. All show improved linearity in the shorter wavelength region, extending the linear range of the instruments. The simulations suggest that variation of wavelength is a key to optimizing linearity around a given feature size. Linearity optimization analysis is performed for several microscope types and measurement algorithms for a simple layer geometry. The optimization program varies threshold and focus offset to achieve the best linearity. The combination of simulation with linearity optimization provides a testbench for metrology system design and evaluation. A new complex-difference algorithm is presented which clearly shows better linearity in at least one simulated case (when resist sidewall angles are changing) than the algorithms commonly used in optical metrology.
Overlay Metrology
Application of mark diagnostics to overlay metrology
Norman H. Goodwin, Alexander Starikov, Grant Robertson
A suite of novel diagnostics and culling software for the BioRad Quaestor optical metrology system has been developed and used to improve overlay (O/L) metrology. It is based on verification of the a priori constraints in images of the target and bullet levels of the O/L measurement marks. The software makes use of the base Quaestor edge detection and matching algorithms to produce two centerlines per level per axis mark detection, rather than one. This enables computation of measures of redundancy or symmetry in images of O/L measurement marks. Such measures of uncertainty of centerline estimation, rather than the values of measured O/L, are the basis for data culling against the user-defined tolerances. Having passed the culling, the average centerline is used in the estimate of O/L. The new Quaestor software is shown to result in a significant reduction of uncertainty in the resulting measurements. Metrology available with the new software is compared to the conventional approach and illustrated with examples. Based on the diagnostics generated by the tool, modifications of O/L mark design, placement, and process of mark formation are pursued with the objective of reducing the net uncertainty of O/L measurements.
Tool and mark design factors that influence optical overlay measurement errors
Patrick M. Troccolo, Nigel P. Smith, Tara Zantow
The measurement of overlay error by optical instruments is subject to systematic error, sometimes known as tool induced shift, or TIS. Some investigations into the relationship between the mark structure and the properties of the instrument have been conducted, but the effects are still poorly understood. In this paper we report on experiments designed to investigate further the relationship between the mark structure, the alignment of the instrument and the resultant TIS error in the measurements. The optical images of wafers manufactured with overlay marks of known offsets and step heights look the same and enable us to distinguish between geometrical and optical effects. By varying the alignment of various optical elements in the tool, the resultant TIS data allows determination of the most critical instrument parameters which must be controlled before accurate measurements can be made.
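The tool-induced-shift figure discussed here is conventionally separated from the wafer-induced overlay by repeating each measurement with the wafer rotated 180 degrees, since the true overlay changes sign under rotation while the tool asymmetry does not; a small sketch of that standard decomposition with hypothetical readings:

    def tis_decompose(overlay_0deg, overlay_180deg):
        """Standard split of an overlay reading into tool-induced shift and true overlay.

        TIS          = (O_0 + O_180) / 2
        true overlay = (O_0 - O_180) / 2
        """
        tis = 0.5 * (overlay_0deg + overlay_180deg)
        overlay = 0.5 * (overlay_0deg - overlay_180deg)
        return tis, overlay

    # Hypothetical readings in nanometers: TIS = 4 nm, true overlay = 14 nm.
    print(tis_decompose(overlay_0deg=18.0, overlay_180deg=-10.0))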
Overlay measurements using the scanning electron microscope: accuracy and precision
Michael G. Rosenfield
Accurate and precise overlay metrology is essential for the successful fabrication of integrated circuits with submicron critical dimensions. We have continued our investigation into the use of the scanning electron microscope (SEM) for overlay metrology at the device level and for calibration of optics-based systems. Uncoated, multiple-level test structures, fabricated using electron-beam lithography, were measured with the SEM at 0.9 and 20 kV, using off-axis and symmetrical electron detector arrangements. These overlay measurements were then compared to measurements made on the same structures, at 20 kV, after deposition of a thin conducting film. After calibration of the SEM magnification, the average agreement between the uncoated structures and the coated `standard' was better than 5 nm using the symmetrical electron detector. Three-sigma measurement precision was estimated to be better than 20 nm at 0.9 kV in the absence of charging. Average tool induced shift was approximately -3 nm. The SEM was also used to measure the poly gate to recessed oxide overlay, after etch, of experimental 0.2 - 0.25 micrometers gate width CMOS devices. A comparison to optical measurements, made at the resist level on large marks at the corners of the chips, showed an average difference of 30 - 40 nm in most cases. The experimental results outlined in this paper strongly suggest that the SEM can be used to make accurate overlay measurements of actual devices and may be useful for calibration of optics-based overlay tools.
Novel approach to placement accuracy analysis
Ken'ichi Kawakami
This paper presents a new mathematical method to analyze placement errors. Using several Hilbert space principles, we analyze placement errors by decomposing them into basic distortion patterns. The second section discusses overlay error in general and mathematically treats mask distortion. Distortion data is represented as a linear combination of the basic distortion patterns. For typical distortions, the corresponding coefficients are shown in section 3. An example of analysis is given in section 4 and the discussions are summarized in section 5.
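The decomposition described above amounts to projecting the measured placement errors onto a set of basis distortion patterns; the sketch below does this by least squares for the familiar linear terms (translation, magnification, rotation), using invented mark positions and errors rather than the paper's basis or data.

    import numpy as np

    # Hypothetical mark positions (x, y) in mm and measured placement errors (dx, dy) in micrometers.
    xy  = np.array([[-40.0, -40.0], [40.0, -40.0], [-40.0, 40.0], [40.0, 40.0], [0.0, 0.0]])
    err = np.array([[0.05, -0.02], [0.11, 0.02], [0.01, 0.06], [0.07, 0.10], [0.06, 0.04]])

    x, y = xy[:, 0], xy[:, 1]
    ones, zeros = np.ones_like(x), np.zeros_like(x)

    # Basis patterns: dx = Tx + M*x - R*y ;  dy = Ty + M*y + R*x
    A = np.block([
        [ones[:, None], zeros[:, None], x[:, None], -y[:, None]],
        [zeros[:, None], ones[:, None], y[:, None],  x[:, None]],
    ])
    b = np.concatenate([err[:, 0], err[:, 1]])

    coeff, *_ = np.linalg.lstsq(A, b, rcond=None)
    tx, ty, mag, rot = coeff
    print(f"Tx = {tx:.3f} um, Ty = {ty:.3f} um, magnification = {mag:.2e} um/mm, rotation = {rot:.2e} um/mm")

Higher-order basis patterns are handled the same way: each pattern adds a column to A, and the residual after subtracting all fitted patterns is the unmodeled distortion.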
Phase-Shift Mask Metrology
Metrology on phase-shift masks
In the evaluation of new manufacturing processes, metrology is a key function, beginning with the first step of process development through the final step of everyday mass production at the fabrication floor level. RIM-type phase shift masks are expected to be the first application of phase shift masks in high volume production, since they provide improved lithography process capability at the expense of only moderate complexity in their manufacturing. Measurements of critical dimension (CD) and pattern position (overlay) on experimental rim-type and chromeless phase shift masks are reported. Pattern placement (registration) was measured using the Leitz LMS 2000. The overall design and important components were already described. The pattern placement of the RIM type phase shift structures on the photomask described above was determined within a tolerance of 25 nm (3s); nominal accuracy was within 45 nm (3s). On the chromeless phase shift mask the measurement results were easily obtained using a wafer intensity algorithm available with the system. The measurement uncertainties were less than 25 nm and 50 nm for precision and nominal accuracy, respectively. The measurement results from the Leitz CD 200 using transmitted light were: a CD distribution of 135 nm (3s) on a typical 6 micrometers structure all over the mask; the 0.9 micrometers RIM structure had a distribution of 43 nm (3s). Typical long term precision performance values for the CD 200 on both chrome and phase shift structures have been less than 15 nm.
Single-level electric testsites for phase-shifting masks
The phase shifting mask technology has quickly progressed from the exploratory phase to a serious development phase. This requires high resolution measurement techniques to quantify experimental results to optimize the designs. This paper describes a set of electrical linewidth measurement testsites which covers all five representative lithographic features in combinations of dark-field and light-field patterns with positive and negative resists. The testsites can investigate binary intensity masks as well as attenuated, alternating, subresolution-assisted, rim, unattenuated, edge, and covered-edge phase shifting masks. All testsites can be used with a single-level wafer exposure. There is no need to remove extra shorts or opens induced by uncovered phase shifters.
Novel architecture for high-speed dual-image generation of pattern data for phase-shifting reticle inspection
Kunihiro Hosono, Susumu Takeuchi, Yaichiro Watakabe, et al.
The pattern data representing ULSI photolithography layers continues to grow exponentially when viewed at the image plane. Data derivation, verification, conversion, and movement have resulted in significant logistical problems and reticle production bottlenecks even with current device densities and reticle manufacturing technologies. With the advent of phase shifting reticle manufacturing and even more dense ULSI devices, database image generation for reticle defect inspection becomes an even more serious issue. Examination of 64 MBit pattern characteristics shows that total figure counts approach 1 billion figures per layer. Phase shifting structures increase figure counts per layer to over 1 billion figures. Defect sensitivities of 0.40 micrometers for chrome defects and 0.30 micrometers for phase shift defects are required for 64 MBit reticle inspection. Single die inspection area exceeds 5000 mm2 and die pixel counts are over 10^11 pixels. Current reticle inspection database image generation technology requires ten hours per inspection pass. Data load times exceed one hour and data conversion to the inspection format exceeds ten hours. Total reticle inspection time in the manufacturing environment may approach 40 hours. A novel pattern generator architecture allowing 64 MBit reticle inspection in one hour is proposed. The NPG architecture includes a new data format, an integrated data conversion package, and a high resolution, high speed image generator. NPG data conversion performance is analyzed and 782 million figure 64 MBit data conversions are performed in less than one minute. Resulting file sizes are one million bytes. The NPG data format is shown to allow increased edge placement resolution to support increased inspection sensitivity. A method for simultaneously generating chrome and phase shift images is presented.
Direct phase measurements in phase-shift masks
Amalkumar P. Ghosh, Derek B. Dove
We have made direct phase shift measurements in phase shift masks using a transmission optical interferometer based upon a modification of an optical, laser scanning reflection profilometer. Measurements were carried out at 632.8 nm in transparent samples that consisted of thin films of SiO2 on fused silica substrates and thin films of SiO2 and Al2O3 on fused silica substrates. Measurements were also performed on attenuated phase shift mask blanks. The phase values measured at 632.8 nm were corrected for refractive index and wavelength for 248 nm.
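For orientation, the phase added by a transparent film at normal incidence follows the standard relation phi = 2*pi*(n - 1)*t/lambda, which is the kind of index-and-wavelength correction involved in translating a 632.8 nm measurement to 248 nm; the fused-silica indices below are typical handbook values used only for illustration.

    def phase_shift_deg(thickness_nm, n_film, wavelength_nm, n_ambient=1.0):
        """Optical phase (degrees) added by a transparent film at normal incidence."""
        return 360.0 * (n_film - n_ambient) * thickness_nm / wavelength_nm

    # Thickness of SiO2 giving a 180-degree shift at 248 nm, assuming n(248 nm) ~ 1.508.
    t = 248.0 / (2.0 * (1.508 - 1.0))
    print(f"target SiO2 thickness ~ {t:.0f} nm")
    print("phase at 632.8 nm:", round(phase_shift_deg(t, 1.457, 632.8), 1), "deg")   # n(632.8 nm) ~ 1.457
    print("phase at 248.0 nm:", round(phase_shift_deg(t, 1.508, 248.0), 1), "deg")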
Lithographic Process Monitoring/Metrology I
Latent image exposure monitor using scatterometry
Lisa-Michelle Milner, Kirt C. Hickman, Susan M. Wilson, et al.
We discuss the use of light scattered from a latent image to control photoresist exposure dose and focus conditions, resulting in improved control of the critical dimension (CD) of the developed photoresist. A laser at a nonexposing wavelength is used to illuminate a latent image grating. The light diffracted from the grating is directly related to the exposure dose and focus and thus to the resultant CD in the developed resist. Modeling has been done using rigorous coupled wave analysis to predict the diffraction from a latent image as a function of the substrate optical properties and the photoactive compound (PAC) concentration distribution inside the photoresist. It is possible to use the model to solve the inverse problem: given the diffraction, to predict the parameters of the latent image and hence the developed pattern. This latent image monitor can be implemented in a stepper to monitor exposure in situ, or prior to development to predict the developed CD of a wafer for early detection of bad devices. Experimentation has been conducted using various photoresists and substrates with excellent agreement between theoretical and experimental results. The technique has been used to characterize a test pattern with a focused spot as small as 36 micrometers in diameter. Using diffracted light in a simulated closed-loop control of exposure dose, CD control was improved by as much as four times for substrates with variations in underlying film thickness, compared to using fixed exposure time. The latent image monitor has also been applied to wafers with rough metal substrates and to focus optimization.
Develop end-point detection: a manufacturer's perspective
A. Gary Reid, Kenneth M. Sautter
Develop end-point systems compensate for a wide variety of upstream process variables such as dose, soft bake time/temperature variation, and changes in resist thickness. However, questions have been raised about signal repeatability as a factor in controlling the manufacturing process. This paper examines the long-term reliability of develop end-point signals by both identifying and quantifying: (1) the amount of end-point time fluctuation in an image-reversal manufacturing process and the degree to which it correlates with changes in linewidths; (2) reasons for signal noise or changes in end-point time that do not indicate shifts in linewidths; (3) processing problems that lead to shifts in end-point times; (4) developer flow rate as a critical variable in an ultrasonic all-spray develop process; and (5) efforts at IBM to implement a develop end-point system as a turnkey operation.
Electrical resistance measurements for full-field lens characterization
Elliott Sean Capsuto, Andrew Michael Lowen, Jim Dadashev, et al.
Lens performance parameters have been traditionally described using terms such as resolution, astigmatism, field curvature, proximity effects, and distortion. However, with decreasing geometries, decreasing exposure wavelengths and tighter bandwidths (such as 248 nm excimer laser lithography), lens characterization in terms of depth of focus (DOF) and exposure latitudes for the entire lens becomes more critical. The challenge is to define a focus and exposure setting that allows one to operate in the `common corridor.' `Common corridor' is defined as the resulting focus-exposure process window for the entire field of the lens encompassing all geometries of a specific line size. A modification to the MONO-LITH software package allows this calculation to be done quickly and easily.
Advanced wafer manufacturing control for yield improvement in the ULSI age
Takafumi Yoshida, Nobuaki Hayashi, Louis Denes, et al.
Primarily, this research focuses on a new technical approach to the solution of improving yield in the manufacture of super-flat wafers for ULSI. Secondarily, it introduces current and future concerns relating to the depth of focus issue as well as an overview of the general wafer manufacturing process. Ever-decreasing lithographic linewidths and ever increasing wafer diameter and site size are placing great demands on wafer makers to produce even flatter wafers to achieve the yields necessary for economical device production. Successful next- generation wafer production will rely heavily on proactive quality and manufacturing processes. Indeed, metrology is now being jointly developed to move flatness inspection from a final QC inspection to in-process, quasi-real time, analysis. The theoretical development and implementation of an advanced digital interferometer system into the actual wafer manufacturing process is described. An advanced, workstation-centered, Ethernet LAN interfaced, system is described. The flatness, and change in flatness, with respect to processing and time is accumulated, tracked, and controlled from the beginning of wafer preparation through final mirror polishing. Additionally, the influence of the polishing block (as used in the popular `wax-mount polishing') on overall wafer flatness is investigated. Finally, absolute wafer thickness as an additional process variable is introduced into the analysis.
Integrated circuit critical-dimension optimization through correlation of resist spin speed, substrate reflectance, and scanning electron microscope measurements
Anne M. Kaiser, Robert M. Haney
A procedure is described for predicting the optimum resist thickness to ensure a reflectivity minimum at the wavelength of interest regardless of the underlying surface reflectivity. An experiment was performed using polysilicon on oxide wafers to demonstrate that uniform CDs could be patterned for a variety of reflectances.
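One way to carry out such a prediction is to scan the standard single-film (Airy) reflectance formula over candidate resist thicknesses for the given substrate and pick the minimum; the optical constants below are illustrative placeholders, not values from the paper.

    import numpy as np

    def reflectance(d_nm, wavelength_nm, n_film, n_sub, n_amb=1.0 + 0j):
        """Normal-incidence reflectance of a single film on a substrate (Airy formula)."""
        r01 = (n_amb - n_film) / (n_amb + n_film)
        r12 = (n_film - n_sub) / (n_film + n_sub)
        beta = 2.0 * np.pi * n_film * d_nm / wavelength_nm
        phase = np.exp(2j * beta)
        r = (r01 + r12 * phase) / (1.0 + r01 * r12 * phase)
        return np.abs(r) ** 2

    wavelength = 365.0                      # i-line exposure, nm
    n_resist = 1.70 + 0.0j                  # illustrative resist index (absorption neglected)
    n_poly = 4.5 + 1.0j                     # illustrative polysilicon index near 365 nm

    d = np.linspace(800.0, 1400.0, 2401)    # candidate resist thicknesses, nm
    R = reflectance(d, wavelength, n_resist, n_poly)
    print(f"reflectivity minimum near {d[np.argmin(R)]:.0f} nm "
          f"(swing period ~ {wavelength / (2 * n_resist.real):.0f} nm)")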
Lithographic Process Monitoring/Metrology II
Overlay and field-by-field leveling in wafer steppers using an advanced metrology system
Martin A. van den Brink, Judon M. D. Stoeldraijer, Henk F.D. Linders
Steppers suitable for resolutions down to 0.35 micrometers are now available for the development and production of 64 Mb DRAMs. At these small feature sizes, the depth of focus approaches one micron and the machine-to-machine overlay requirements are decreased to below 150 nm. This paper reports on the design and characterization of focus, alignment, and stage metrology in a wafer stepper which can meet the aggressive feature size demands. The characteristics of a new broad band field-by-field leveling sensor system with an enhanced process tolerance compared to conventional systems are described. Furthermore, this leveling system has been combined with a multi-axis interferometer system to support a high throughput global alignment strategy with field-by-field leveling. Wafer stepper performance results are presented including depth of focus improvements and global alignment overlay with and without field-by-field leveling. The dependency of focus on wafer structure and resist thickness variation is also shown. The total metrology system makes possible a two point global alignment overlay performance below 30 nm, a focus repeatability of 25 nm, and a tilt repeatability of 2 µrad.
Direct measurement of stepper mark detection accuracy on processed wafers
Paolo Canestrari, Samuele Carrera, Giovanni Rivera
A novel method to evaluate the accuracy of the stepper alignment system on processed substrates has been developed. The technique allows one to measure directly, with a limited number of wafers and high accuracy, just the contribution of the alignment system inaccuracy to final overlay. Applications of the method are under evaluation, especially in the optimization of the alignment systems of steppers. Experimental procedures and algorithms are provided and some examples of experimental results are shown.
Correlation of 150-mm silicon wafer site flatness with stepper performance for deep submicron applications
Howard R. Huff, Joseph C. Vigil, Birol Kuyel, et al.
An experimental study was conducted to correlate wafer site flatness SFQD with stepper performance for half-micron lines and spaces. CD measurements were taken on wafers patterned on both GCA pre-production XLS i-line and SVGL Micrascan-90 DUV steppers, as well as focus measurements on the Micrascan-90. Wafer site flatness SFQD of less than 0.3 micrometers was observed to be a sufficiently small contributor to CD non-uniformities for these initial half-micron stepper applications.
Performance of through-the-lens/off-axis laser alignment systems and alignment algorithms on Nikon wafer steppers
Nigel R. Farrar, Frederik Sporon-Fiedler
New generations of ULSI devices require significant improvements in circuit overlay. The performance limits of a current wafer stepper alignment system have been evaluated by testing alignment target capture on planarized targets and rough substrates. Although new hardware is being developed for alignment under difficult conditions, it has been shown that new algorithm designs can extend the performance of existing systems.
Characterization of a one-layer overlay standard
Helmut Besser
This paper describes the characterization of a specially designed one-layer overlay wafer by taking measurements on an optical metrology system and the scanning electron microscope, as well as measurements at the 0- and 180-degree rotational positions on the optical system. The characterized wafer was then compared with an overlay standard provided by VLSI Standards Inc. Results are in good agreement and within a few nanometers of nominal values. This characterized wafer is now accepted and used as an in-house standard.
Particles and Defects
Advanced in-line process control of defects
M. Michael Slama, Marylyn Hoy Bennett, Peter W. Fletcher
A variety of techniques are currently in use for process monitoring and control of wafer quality during production. In general these techniques fall into two categories: (1) particle monitors that provide fast, simple results but have limited visibility of many defect types, and (2) more advanced systems that monitor a much larger class of potential problems but require significant analysis time and interpretation of results. The first category has traditionally been used for in-line monitoring, while the second category has served primarily for engineering analysis and R&D applications. Recent technical developments, in particular the development of advanced systems based on optical pattern filtering, have begun to blur the distinction between in-line and engineering analysis tools. This paper establishes performance guidelines for in-line process monitoring tools, reviews the current techniques utilized against these guidelines, and discusses the potential applications of these methods to in-line process control. Systems based on these new techniques hold the promise of providing sophisticated analysis capability for in-line process control applications, by offering extremely high throughput (in the range of a few minutes per wafer) and high sensitivity (0.20 micrometers and better), combined with intelligent but fully automated analysis of the data to provide `single-number' reports on production wafers.
Submicron defect detection standard for patterned wafer inspection systems
Daniel V. Grelinger
Automated defect detection equipment has been used extensively for patterned wafer inspection in the semiconductor industry. These systems are used to find a variety of patterning and process defects on silicon wafers, before device completion, so that action may be directed toward eliminating the cause of the defects. The method of detection that each type of inspection system uses varies significantly, as does its performance when inspecting an assortment of patterns and materials. Standard materials to quantify the performance of inspection systems are not available, and as a result, a myriad of pseudo-standards are used to measure and compare performance. The suppliers of patterned wafer inspection systems routinely provide `standards' with which to test and qualify their equipment. Generally, programmed `defects' embedded within generic test patterns are reproduced from a mask onto a silicon wafer using standard deposition, lithographic, and etch processes. These `standard test wafers' are of limited value for the task of quantifying the performance of the inspection equipment for several reasons. The photolithographic techniques that are most often used to produce programmed `defects' on test wafers provide for the construction of only one broad type of `defect.' This one type represents a very small sample of the variety of real defects that the detection system is expected to find during actual wafer inspections. Therefore, it is not possible to quantify the performance of the system relating to the other types of real defects. The limitations of the lithography prevent precise control over the shape and dimensions (and even the reproduction) of sub-micron `defects.' The patterns and materials used for the `standard' are generally not representative of actual semiconductor product wafers, on which real inspections are to be done. Test patterns are generally single level and high-contrast, with relatively large geometries. `Defects' are reproduced in the same plane as the test pattern, and at the same thickness as the material in which the pattern is being defined. Whereas the sensitivity performance of the detection system may be suitable on the test pattern, relative performance during inspections on real product wafers with complex multilevel patterns of sub-micron geometries cannot be inferred. An alternate method for producing and qualifying `standards' for patterned wafer defect detection system evaluation that represents a significant enhancement over existing methods has been developed. This method utilizes a focused ion beam (FIB) to fabricate the `defects' directly onto real production semiconductor wafers. The use of an FIB to place milled and deposited `defects' onto a patterned wafer is a new application of this technology. The capabilities of the FIB provide unique potential to address many of the common problems with existing patterned wafer defect standard generation and application.
Surface defect inspection system with an optical spatial frequency filter for semiconductor patterned wafers
Yoko Miyazaki, Hitoshi Tanaka, Nobuyuki Kosaka, et al.
We previously developed a surface defect inspection system for semiconductor patterned wafers. The system utilizes spatial frequency technology, where regular periodic patterns are eliminated and only defects are imaged at the image plane by using a spatial frequency filter. The system had a detection sensitivity of 0.8 micrometers and an inspection speed of 30 minutes for a 6-inch wafer. With the advent of 16 MDRAMs having a design rule of 0.5 micrometers, however, higher detection sensitivity of about 0.3 micrometers is required. Thus, we have been developing an advanced inspection system based on the previous achievement to meet the emerging needs. The problem to tackle is signal-to-noise ratio (SNR) improvement. To achieve higher SNR on the filtered image, we introduced the following two ideas: (1) angled incidence of the laser on the wafer surface and the avoidance of regular reflection light entering the optical system, and (2) a spatial filter which is composed of two photo-plates facing each other. We have developed an optical imaging system and have experimentally shown that the system images defects down to 0.2 micrometers with adequate SNR over a field of view of 1000 x 1000 micrometers.
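A digital analogue of the optical spatial-frequency filter is instructive: suppress the strongest Fourier components (the periodic pattern) and inverse-transform, so the defect dominates what remains. The synthetic image and threshold below are invented for the sketch and do not model the actual optics.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 256
    yy, xx = np.mgrid[0:n, 0:n]

    # Synthetic die image: a regular line/space pattern, one small defect, and noise.
    image = 0.5 + 0.5 * np.sign(np.sin(2.0 * np.pi * xx / 16.0))
    image[130:133, 70:73] += 0.8                       # the programmed "defect"
    image += 0.05 * rng.standard_normal(image.shape)

    # Null the dominant Fourier components (DC and the pattern harmonics), keep the rest.
    F = np.fft.fft2(image)
    magnitude = np.abs(F)
    F[magnitude > np.quantile(magnitude, 0.999)] = 0.0
    filtered = np.abs(np.fft.ifft2(F))

    # The brightest residual pixel is expected to fall inside the defect region.
    print("brightest residual pixel:", np.unravel_index(np.argmax(filtered), filtered.shape))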
Standard patterned wafer for performance evaluation of inspection tools
Rivi Sherman, Ehud Tirosh, Gadi Neumann, et al.
The defect detection capability of patterned wafer inspection tools is mainly determined by type and size of defects appearing on the wafer. However, detection capability depends not only on defect type and size but also on pattern density and defect location relative to the pattern. In particular, a dense pattern results in higher false alarm probability. Moreover, detection probability decreases with the distance between the defect and the neighboring pattern. The pattern density and defect location are not taken into account in the design of commonly used test wafers. This paper presents the design of a test-wafer aimed at meeting these essential properties. The wafer presented here, exhibiting variable density and spacing characteristics, has a potential of being used as an industry standard for evaluation of patterned wafer inspection tools.
Special Topics and Emerging Technologies
Recent developments in atomic-force microscopy applicable to integrated circuit metrology
Mark R. Rodgers, Frank D. Yashar
The atomic force microscope (AFM) has become a fairly common tool in IC research centers over the past year. Nondestructive imaging with nanometer resolution in ambient conditions is proving to have a wide range of uses. Furthermore, during this same period important advances have been made in scanning probe microscopy, particularly relating to atomic force sensors. AFM technology now has reached the point where it has practical applicability to IC manufacture as well as research. These new advances include: large sample (full wafer) capability, probe tips capable of measuring sidewalls as steep as 10 degrees from the vertical, frictional force measurements, zoom capability from 100 microns to angstroms, highly accurate lateral dimensioning, and noncontact topography measurements. Nanometer-scale metrology with true three-dimensional measurement capability can now be applied to production IC devices and process hardware. Several of these new capabilities are commercially available. Some uses and applications in the semiconductor field are: high resolution imaging, surface roughness measurement of deposited layers and polishing techniques, defect imaging, phase-shift mask development, grain size measurement, gate integrity, and deposited layer integrity over lines. A variety of results are presented including both standard metrology data acquired with AFMs as well as unusual data that cannot be obtained with other techniques. Features with accurate sidewall rendering are demonstrated which clearly show that previous AFM limitations of sidewall imaging have been overcome. Photomask characterization including phase-shift masks is presented, and unique data showing frictional force distributions are discussed. The evolution of the AFM has been very rapid, and the current state of the instrument should provide solutions to many of the problems facing IC metrologists.
Residual film detection using UV reflectance difference measurements
Anne M. Kaiser
Detection and removal of residual films is important throughout the semiconductor integrated circuit fabrication process. However, at two steps in the production process, silicide formation and selective tungsten growth, removal of residual films is critical. Failure to remove residuals prior to processing at these two steps results, at best, in devices with degraded performance and, at worst, in non-functional devices. A method for detecting residual films using ultraviolet reflectance measurements is described and experimental results are presented.
Scanning probe metrology
David A. Grigg, Joseph E. Griffith, G. P. Kochanski, et al.
The design of a scanning probe microscope suitable for metrology applications must include solutions to several problems. Actuator errors can be large because of their nonlinear behavior, but this can be solved by independently monitoring the actuator's motion. The probe must be shaped properly for a given measurement, and it must be characterized to allow interpretation of the measurement. We have studied the effects of interaction forces and probe shape with emphasis on surface roughness measurements.
Expert system for performing measurement system characterization
The considerations which drive an expert system for assisting in measurement system characterization are described. The expert system employs several novel techniques for evaluating the integrity of a characterization analysis by determining the degree to which critical assumptions are satisfied and flagging weak points in the data collection or analysis procedure. The properties of good characterization sampling plans are derived. Methods for formulating reliable characterization studies are described. The paper focuses on short term studies intended for equipment comparisons and calibrations; however, with minor alterations it can be expanded to include longer term stability studies.
Phase-contrast latent image metrology for microlithography
Euisik Yoon, Robert W. Allison Jr., Ronald P. Kovacs, et al.
This paper describes a new technical approach to IC lithography characterization by using phase-contrast latent image metrology. Latent images of exposed, undeveloped photoresist observed by the dielectric discontinuity microscope (DDM) can be used for rapid, accurate, and optimal characterization of microlithography processes. Typically, the latent image cannot be observed by standard bright-field microscopy. The DDM provides a phase contrast image, in which any optical path difference is changed to contrast enhancement of the image; therefore, the photochemical transformation of photoresist due to exposure (optical thickness change) can be easily observed. A distinctive latent image of an I-line stepper has been observed down to 0.6 micrometers feature size, while that of E-beam direct writing down to 0.2 micrometers. The 660 nm viewing wavelength does not damage the photoresist during observation. Line width variation has been measured as a function of exposure energy and shows a strong relationship between latent images and fully developed photoresist images. Contrast of a latent image has been compared as a function of exposure energy and defocus for various line/space patterns. From these experiments, the optimal exposure energy and depth of focus (DOF) can be determined along with the corresponding development procedure. This is critical for rapid and accurate submicron lithography optimization because latent image calibration can eliminate secondary effects resulting from post exposure development processing and can also exclude laborious and time-consuming SEM inspections which are routinely performed in a typical lithography calibration. Digital signal processing software has been implemented for the DDM video images with on-line CD measurement capability. Measuring an average contrast on a specific window region along with other image processing functions allows fully quantitative evaluation of the microlithography.
Integrated Circuit Manufacturing System and Technology
Microautomation of semiconductor fabrication
Ilene J. Busch-Vishniac
Micro-automation is defined as the automatic control of processes that require that relative motions be achieved with micron or submicron accuracy. This article discusses micro-automation of mechanical processes in fabrication of semiconductor devices. We identify a few applications of micro-automation, elaborate on the general system requirements, and present a specific realization of a micro-automation system which uses optical sensing and magnetic actuation.
Use of computer optimization programs for the enhancement of Nikon stepper throughput with defectivity reduction benefits
Yumiko Takamori, Christof G. Krautschik
It is known that the wafer shot maps generated by the Nikon stepper software are not truly optimum, and if one can understand the uniqueness of the Nikon software, more die may be placed on the wafer. We have developed two software applications: WAFER OPT finds an optimum wafer shot layout, and GRAPH DRAW draws an existing wafer shot map generated by the Nikon stepper software for comparison to the optimized version. A `cleaner' shot map, with fewer clusters yet significantly more die on the wafer, can be achieved after assessing the validity of changing an existing wafer map to an optimum one. Hence, the two computer applications can result in greatly enhanced stepper throughput and reduced defectivity from the partial clusters often necessary in a nonoptimum wafer shot map. Given a set of parameters such as exposure field dimensions, die layout within a reticle, partial cluster criterion, wafer size, wafer flat length, wafer edge exclusion width, and wafer identification character size, the WAFER OPT program mathematically finds a shot map having the maximum number of complete die using the minimum number of shots and having the least sensitivity to map placement errors of the stepper stage. In addition, WAFER OPT calculates x and y map offsets, uniquely defined in the Nikon software, that can be directly entered in the stepper data file. Another software application called GRAPH DRAW can be used to draw an existing wafer shot map for comparison to the optimized version. A case that resulted in a significant increase in the number of die per wafer and a decrease in both the number of exposure fields and the number of partial die on the wafer is presented.
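The counting problem at the heart of such a layout optimizer can be sketched as a brute-force search over shot-grid offsets for the placement that maximizes complete exposure fields inside the usable wafer area; the field size, wafer diameter, and edge exclusion below are hypothetical, and the sketch ignores the wafer flat, clustering rules, and other Nikon-specific constraints.

    import numpy as np

    FIELD_X, FIELD_Y = 20.0, 20.0      # exposure field size, mm (hypothetical)
    WAFER_R = 75.0                     # 150 mm wafer, mm
    EDGE_EXCL = 3.0                    # edge exclusion, mm
    USABLE_R = WAFER_R - EDGE_EXCL

    def complete_fields(offset_x, offset_y):
        """Count exposure fields whose four corners all lie inside the usable radius."""
        count = 0
        for i in range(-8, 9):
            for j in range(-8, 9):
                x0, y0 = offset_x + i * FIELD_X, offset_y + j * FIELD_Y
                corners = [(x0, y0), (x0 + FIELD_X, y0), (x0, y0 + FIELD_Y), (x0 + FIELD_X, y0 + FIELD_Y)]
                if all(np.hypot(cx, cy) <= USABLE_R for cx, cy in corners):
                    count += 1
        return count

    # Brute-force search over grid offsets within one field period.
    best = max(((complete_fields(dx, dy), dx, dy)
                for dx in np.linspace(0.0, FIELD_X, 21)
                for dy in np.linspace(0.0, FIELD_Y, 21)), key=lambda t: t[0])
    print(f"best layout: {best[0]} complete fields at offset ({best[1]:.1f}, {best[2]:.1f}) mm")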
Lithographic chip identification: meeting the failure analysis challenge
Lynn Perkins, Kevin G. Riddell, Warren W. Flack
This paper describes a novel method using stepper photolithography to uniquely identify individual chips for permanent traceability. A commercially available 1X stepper is used to mark chips with an identifier or `serial number' which can be encoded with relevant information for the integrated circuit manufacturer. The permanent identification of individual chips can improve current methods of quality control, failure analysis, and inventory control. The need for this technology is escalating as manufacturers seek to provide six sigma quality control for their products and trace fabrication problems to their source. This need is especially acute for parts that fail after packaging and are returned to the manufacturer for analysis. Using this novel approach, failure analysis data can be tied back to a particular batch, wafer, or even a position within a wafer. Process control can be enhanced by identifying the root cause of chip failures. Chip identification also addresses manufacturers' concerns with increasing incidences of chip theft. Since chips currently carry no identification other than the manufacturer's name and part number, recovery efforts are hampered by the inability to determine the sales history of a specific packaged chip. A definitive identifier or serial number for each chip would address this concern. The results of chip identification (patent pending) are easily viewed through a low power microscope. Batch number, wafer number, exposure step, and chip location within the exposure step can be recorded, as can dates and other items of interest. The chip identification procedure and processing requirements are described. Experimental testing and results are presented, and potential applications are discussed.
Automated process control for plasma etching
Margaret McGeown, Khalil I. Arshak, Eamonn Murphy
This paper discusses the development and implementation of a rule-based system which assists in providing automated process control for plasma etching. The heart of the system is to establish a correspondence between a particular data pattern -- sensor or data signals -- and one or more modes of failure, i.e., a data-driven monitoring approach. The objective of this rule-based system, PLETCHSY, is to create a program combining statistical process control (SPC) and fault diagnosis to help control a manufacturing process which varies over time. This can be achieved by building a process control system (PCS) with the following characteristics: a facility to monitor the performance of the process by obtaining and analyzing data relating to the appropriate process variables. Process sensor/status signals are input into an SPC module. If trends are present, the SPC module outputs the last seven control points, a pattern which is represented by either regression or scoring. The pattern is passed to the rule-based module. When the rule-based system recognizes a pattern, it starts the diagnostic process using the pattern. If the process is considered to be going out of control, advice is provided about actions which should be taken to bring the process back into control.
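A minimal sketch of the trend-detection step described above, fitting a regression line to the last seven control points and flagging a drift that is large relative to the residual scatter; the threshold and the sample data are assumptions, not the PLETCHSY rule set.

    import numpy as np

    def trend_pattern(points, drift_threshold=2.0):
        """Classify the last seven control points as 'up-trend', 'down-trend', or 'in-control'."""
        y = np.asarray(points[-7:], dtype=float)
        t = np.arange(y.size)
        slope, intercept = np.polyfit(t, y, 1)
        resid_sd = np.std(y - (slope * t + intercept)) or 1e-12
        # Total drift across the window, expressed in units of residual standard deviation.
        score = slope * (y.size - 1) / resid_sd
        if score > drift_threshold:
            return "up-trend", round(score, 1)
        if score < -drift_threshold:
            return "down-trend", round(score, 1)
        return "in-control", round(score, 1)

    # Hypothetical monitor values handed over by the SPC module.
    print(trend_pattern([102.1, 102.4, 102.2, 102.9, 103.1, 103.4, 103.8]))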
Design, performance validation, and reliability testing of a new photochemical dispense pump
William F. Bowers, Stephen Hunt, Ben Lee, et al.
The continued reduction in device linewidths and film thicknesses has led to the need for tighter control of the photochemical dispense process. Accurate and repeatable application of thin films of photoresist is complicated by the need for point-of-use filtration as close to the dispense nozzle as possible. This paper describes the design, validation, and reliability testing of a new photochemical pump whose primary requirements were cleanliness and repeatability. Both the dispense rate and the dispense volume were to be unaffected by changes in temperature, fluid viscosity, filter loading, or air in the filter. The Wafergard GEN-2™ photochemical dispense system is a stepper-motor driven, diaphragm-dispense pump which provides point-of-use filtration to reduce contamination (gels, microbubbles, and particles) and provide precise and repeatable dispense of photochemicals. The pump is a two-stage system in which a 0.1 micrometers stacked disk Teflon™ filter is isolated from the dispense chamber, thus allowing the filtration rate to be uncoupled from the dispense rate. The chemical flowpath is all-Teflon. The dispense diaphragm is hydraulically coupled through a metal bellows to a zero-backlash stepper linear actuator. These design features make the dispense rate, profile, and volume independent of the filter loading. Performance validation testing has been done. Long term (greater than 100,000 cycles) testing using 30 cps positive photoresist with typical operating conditions (2 mL dispense volume at a 2 mL/sec dispense rate through a ten-foot 5/32" I.D. section of tubing) showed total volume repeatability to be within +/- 0.02 grams (3 Std Dev). A new method for quantifying the dispense flowrate profile has been developed and used to record the effect of system compliancy on flow dynamics. Wafer coating performance studies using an SVG 90 Series Resist Processing System addressed uniformity and resist consumption. Extensive reliability testing of GEN-2 has been performed. The pump contains a Teflon diaphragm which drives the photochemical out through the dispense nozzle. Possible permeation through this critical component was thoroughly investigated. Several common photochemical solvents, such as xylene, NMP, ethyl lactate, and PGMEA were tested for compatibility and permeability through the diaphragm. Examination using Graphite Furnace AA identified no cross contamination of ions in the pump at the ppb level. Diaphragm and drive train reliability testing of five pumps is underway. Two pumps have been operated for more than 1,000,000 cycles and one for over 500,000 at normal case volume and pressure conditions, and two have passed 275,000 cycles at worst case volume and pressure. Analysis was done of the long-term volume precision, ion contamination of the fluids, and mean-time-before-failure.
Plasma deposition of high-quality silicon dioxide
Tie-Han Wang
This paper presents a new PECVD deposition system. The reaction principle is described, and experimental results show that the properties of the deposited film are nearly those of thermally grown SiO2. The film has been used successfully as a passivation layer and diffusion barrier in the semiconductor device manufacturing process.
Linewidth Metrology I: Applications
Metrology algorithms for machine matching in different CD SEM configurations
Terrence W.O. Reilly
Within semiconductor companies, there may be many different critical dimension (CD) measurement instruments. They may be optical, electrical, confocal laser, or scanning electron microscopes (SEMs). Often they are not only of different configurations but of different brands as well. These variations of type and brand have created the need for a measurement algorithm that can deliver the same measurement on two different instruments. It is possible that the development group within a semiconductor company uses a CD SEM with expanded capabilities compared to the production group's CD SEM. In this case, a measurement algorithm unaffected by the differences in signal outputs from the varying microscope designs would enhance system matching. One would expect two identical CD systems to produce nearly the same measurement; however, when two totally different systems are compared, only a robust algorithm will give good machine-to-machine matching of measurements. This paper examines two measurement algorithms using two completely different CD and inspection SEMs. The purpose is to examine each algorithm's ability to deliver good machine-to-machine matching regardless of how the secondary waveform signal is generated.
Achieving linearity for dense CD measurements with confocal metrology
Torsten R. Kaack, Lynda Clark Hannemann-Mantalas, Timothy R. Piwonka-Corle
Metrology systems for submicron integrated circuit manufacturing are required to simultaneously meet stringent requirements for measurement precision, linearity, and focus-exposure tracking over a broad range of substrates and nominal structure sizes. By meeting these requirements, a system guarantees the user's ability to track changes in a lithography process. Data presented at this conference in 1991 suggested that optical metrology systems may experience resonance phenomena which adversely affect linearity for linewidth measurements below 0.8 micrometers nominal. During the last year, however, we have been able to overcome this effect for our metrology system. The ConQuest™ 2000 white-light, real-time confocal metrology system was used to measure linewidths at the bottom of dense critical dimension (CD) structures from 0.35 micrometers to 1.5 micrometers nominal. Data for X and Y versus Z are acquired by scanning the sample along the optical axis of the microscope (Z direction) and acquiring an image at each Z position. The bottom of the line is found by applying a selectable focus algorithm to the data. A digitized video linescan through the measurement structure at this Z position is then analyzed and measurements of the selected edges are made. In our characterizations and evaluations, we have repeatedly seen excellent linearity results. Data are presented for a variety of substrates, including patterned photoresist and etched samples. The results reflect a high degree of precision and excellent correlation to SEM data throughout the measurement range.
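The measurement flow described above -- scan in Z, pick the best-focus plane, then extract edges from a linescan at that plane -- can be illustrated with the minimal sketch below. The variance-based focus metric and the fixed-threshold edge criterion are assumptions standing in for the system's selectable focus algorithm and edge-detection step; the linescan data are hypothetical.

```python
# Minimal sketch of the measurement flow described above, with assumed details:
# a variance-based focus metric stands in for the selectable focus algorithm,
# and a fixed-threshold edge criterion stands in for the edge-detection step.
import numpy as np

def best_focus_index(z_stack):
    """Pick the Z slice with the highest image variance (assumed focus metric)."""
    return int(np.argmax([np.var(img) for img in z_stack]))

def linewidth_from_linescan(linescan, pixel_size_um, threshold=0.5):
    """Measure the width of the dark feature where the normalized signal
    drops below a threshold (assumed edge criterion)."""
    s = np.asarray(linescan, dtype=float)
    s = (s - s.min()) / (s.max() - s.min())
    below = np.where(s < threshold)[0]
    if below.size == 0:
        return 0.0
    return (below[-1] - below[0]) * pixel_size_um

if __name__ == "__main__":
    # Hypothetical linescan across a dark line sampled at 0.05 um/pixel.
    scan = [1.0] * 10 + [0.1] * 10 + [1.0] * 10
    print(f"CD = {linewidth_from_linescan(scan, 0.05):.2f} um")
```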
Lithographic Process Monitoring/Metrology II
Film thickness measurement of ultrathin film using UV wavelength light
Nobuyuki Kondo, Nariaki Fujiwara, Atsushi Abe
In semiconductor production lines, film thickness is typically measured using ellipsometry or microspectroscopic measurement. The microspectroscopic measurement is fast, highly accurate, and easy to carry out. However, when the film thickness is less than a few tens of nanometers this method cannot provide adequate measurements. We changed the wavelength used for measurement from the visible range to the ultraviolet (UV) range and were able to achieve ultrathin-film measurement. At the same time, we found other useful applications of this approach: SiO2-film measurement on a polysilicon layer, evaluation of the degree of crystallization of Si film, and others. For highly accurate measurements of ultrathin films, it is vital that the focus and the sample surface angle be precisely adjusted. In this paper, we report a UV film-thickness measurement system with an automatic adjustment mechanism for the focus and sample surface angle, and present example measurements obtained with this system.
Overlay Metrology
Overlay process control for 16-Mb DRAM manufacturing
Audrey C. Engelsberg, Debra Leach
The tighter film and image size tolerances required for technologies at 0.5 micrometers and below make the control of overlay a critical process parameter. A modeling approach has been developed that is independent of the metrology tool and flexible enough to control overlay for a mix-and-match photolithography strategy. This paper describes the application of the methodologies developed and implemented to control overlay.
Lithographic Process Monitoring/Metrology II
Laser surface profilometer with subangstrom resolution
David J. Mansur, David W. Voorhes, Geert J. Wijntjes
Single-point diamond turning technology has allowed the manufacture of highly aspheric optics that were unimaginable just a few years ago. Testing these optics is typically either expensive (i.e., null lens systems), destructive (contacting stylus profilers), or slow (existing optical profilers). We describe a noncontacting surface profilometer with 5 micrometers lateral spatial resolution and sub-angstrom height resolution. The instrument is configured as a polarizing interferometer that operates at near zero path difference and uses a visible laser diode as a light source. The instrument incorporates an autofocussing mechanism that allows high-resolution profilometry over a vertical range of +/- 1 mm. The system bandwidth of 200 kHz allows for very fast profiling (87 cm/sec) without loss of resolution. The sensor head is very compact (6 X 6 X 10 cm) and all the processing electronics are contained on a single PC-compatible card. The instrument's autofocussing feature, high-speed capability, and compact size make it ideally suited for in-process and final acceptance testing of steep aspheric optics. We present data that demonstrate our ability to accurately measure the macroscopic and microscopic features of single-point diamond-turned aspheric optics typically used in high-resolution x-ray lithography systems.
Focused ion-beam process monitoring
The recent development of focused ion beam systems with image resolution in the 20 nm regime has made practical a new process monitoring discipline, in-line x-y-z metrology. At any step in the wafer fabrication cycle, it is now possible to rapidly image a top view or cross-sectional profile of the exact location on a chip or test structure where monitoring is required. For example, one can examine metal and oxide step coverage, via dimensions, resist profiles, metal grain size, or film quality. Under computer automation, any region on an 8-inch wafer can be located. A hole several microns deep and wide can be milled at this site, the wafer tilted up to 60 degrees, and the walls of this hole imaged at magnifications approaching 70,000 times. Any arbitrary sequence of steps may be linked together to define a procedure which can be applied to wafer after wafer. Image information available during such a sequence can be uploaded to a host computer and statistical process control methods applied to the image parameters of interest. In this paper we describe the characteristics of a new focused ion beam system, its hardware and software control, and typical results from the cross sectioning of a 4 Mb DRAM.
Response surface modeling utilizing lithographic process simulation
Bruce W. Smith, Wai-Man Shiao, Richard D. Holscher, et al.
A method of incorporating statistically designed fractional factorial experiments into lithographic process simulation software (PROLITH/2) has been used to determine input factor interrelationships inherent within a lithographic process. Rotatable Box-Behnken designs with three centerpoints were utilized for the experiment. The response surface methodology approach was used to analyze the influence of independent factors on a dependent response, and optimize each process. A `method of steepest ascent' was utilized to produce first-order models, which were verified by lack of fit testing. As optimum operating points were approached, a second-order model was fitted and analyzed. A series of experiments studying the effects of prebake, exposure, post-exposure bake, and development on critical dimension and profile in PROLITH/2 produced response surfaces relating each main factor effect as well as non-linear and interaction effects. Additionally, experiments were conducted studying effects of wavelength, numerical aperture, coherence, feature size, defocus, and flare on aerial image contrast. Process optimization for the target response value as well as process latitude as it relates to all factors simultaneously was then possible through use of the response surface.
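As an illustration of the response-surface step, the sketch below fits a second-order model by least squares to responses from a stand-in simulator. PROLITH/2 is not invoked; the `simulate_cd` function, its coefficients, and the small full-factorial design are hypothetical.

```python
# Minimal sketch of fitting a second-order response surface to simulator output.
# PROLITH/2 is not invoked here; "simulate_cd" and its coefficients are hypothetical
# stand-ins for exposure/PEB effects on critical dimension.
import numpy as np

def simulate_cd(exposure, peb_temp):
    """Hypothetical quadratic CD response with an interaction term (arbitrary units)."""
    return (0.50 - 0.002 * (exposure - 150) + 0.001 * (peb_temp - 110)
            - 1e-4 * (exposure - 150) ** 2 + 5e-5 * (exposure - 150) * (peb_temp - 110))

def fit_quadratic_surface(x1, x2, y):
    """Least-squares fit of y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2."""
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 ** 2, x2 ** 2, x1 * x2])
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs

if __name__ == "__main__":
    # Small 3 x 3 factorial over exposure (mJ/cm2) and PEB temperature (deg C).
    exp_grid, peb_grid = np.meshgrid([140.0, 150.0, 160.0], [105.0, 110.0, 115.0])
    exposure, peb = exp_grid.ravel(), peb_grid.ravel()
    cd = simulate_cd(exposure, peb)
    print("fitted coefficients:", np.round(fit_quadratic_surface(exposure, peb, cd), 6))
```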
Application of a dielectric discontinuity microscope to process development at the Fairchild Research Center of National Semiconductor
Robert W. Allison Jr., Euisik Yoon, James G. Heard, et al.
A new type of microscope (DDM) has been applied to submicron process development at the Fairchild Research Center. This high resolution video microscope produces an image which is the superposition of a dielectric discontinuity (phase contrast) and an absorptive optical image. With this instrument a Sparrow's resolution of 0.08 micrometers has been achieved at magnifications from 1150 times to 18,000 times. VIA and contact clearing have been observed from 0.1 micrometers to 1.4 micrometers at aspect ratios of up to 3:1. CD measurements have been made on both latent images and developed images and the results used to optimize the exposure energy for an I-line stepper. The DDM has also been used to visualize defects which are not visible with conventional microscopy. Both metallic and dielectric contaminant films have been detected and a submicron dielectric sidewall has been visualized on an advanced interconnect system. Material deposited during development using the MIMMI process has been observed. A simplified phase contrast transition theory is presented and applied to the observations.
Measuring thickness of a film deposited onto a multilayer metal surface
Measurement of the thickness of the top layer of a wafer having many film layers using microreflectometry normally requires detailed and rather accurate values of the thicknesses and optical properties of the underlying layers and substrate. In many cases such knowledge is unavailable; for example, the layers may be deposited sequentially in a process that does not allow thickness measurements at each stage. A procedure is described to characterize the underlying layers and to determine an effective substrate that allows accurate calculation of the top layer thickness in subsequent measurements. The procedure has been applied to antireflection-coated aluminum, which is used at one stage of semiconductor manufacturing. Silicon dioxide thicknesses of 0.1 to 2.2 micrometers determined using the procedure predict reflectances to within 0.6% of the measured values for wavelengths throughout the entire 400 to 750 nm spectrum.
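For reference, the kind of calculation such a procedure rests on is the standard single-film reflectance at normal incidence, with the entire underlying stack absorbed into an effective substrate index. The notation below (in particular the effective index) is ours, not the paper's.

```latex
% Standard single-film reflectance at normal incidence; the underlying stack is
% absorbed into an effective substrate index \tilde{n}_s (notation assumed here).
\[
r = \frac{r_{01} + r_{1s}\, e^{-2i\beta}}{1 + r_{01} r_{1s}\, e^{-2i\beta}},
\qquad
\beta = \frac{2\pi n_1 d}{\lambda},
\qquad
R(\lambda) = |r|^2,
\]
\[
r_{01} = \frac{n_0 - n_1}{n_0 + n_1},
\qquad
r_{1s} = \frac{n_1 - \tilde{n}_s}{n_1 + \tilde{n}_s}.
\]
```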
Use of scatterometry for resist process control
Kenneth P. Bishop, Lisa-Michelle Milner, S. Sohail H. Naqvi, et al.
The formation of resist lines having submicron critical dimensions (CDs) is a complex multistep process, requiring precise control of each processing step. Optimization of parameters for each processing step may be accomplished through theoretical modeling techniques and/or the use of send-ahead wafers followed by scanning electron microscope measurements. Once the optimum parameters for a process have been selected (e.g., time duration and temperature for the post-exposure bake), no in-situ CD measurements are made. In this paper we describe the use of scatterometry to provide this essential metrology capability. The technique involves focusing a laser beam on a periodic grating and predicting the shape of the grating lines from a measurement of the scattered power in the diffraction orders. The inverse prediction of lineshape from a measurement of the scattered power is based on a vector diffraction analysis used in conjunction with photolithography simulation tools to provide an accurate scatter model for latent image gratings. This diffraction technique has previously been applied to observing latent image grating formation as exposure takes place. We have broadened the scope of the application and consider the problem of determining optimal focus.
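For context, the angular positions of the diffraction orders whose powers are measured follow the standard grating equation below; the lineshape inversion itself relies on the vector diffraction model cited in the abstract and is not reproduced here.

```latex
% Angular positions of the propagating diffraction orders for a grating of pitch
% \Lambda illuminated at angle \theta_i with wavelength \lambda (standard grating
% equation; the lineshape inversion uses the vector diffraction model cited above).
\[
\sin\theta_m = \sin\theta_i + \frac{m\lambda}{\Lambda},
\qquad m = 0, \pm 1, \pm 2, \dots
\]
```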
Measurement of multilayer film and reflectivity on wafers using ultraviolet-visible microspectrophotometry
Warren Lin, Vincent J. Coates, Bhanwar Singh
An ultraviolet (UV) microspectrophotometer, the NanoSpec™/AFT 210UV, with a measuring spot size of less than 10 microns, can make accurate measurements of SiO2 on polysilicon and of thin SiO2 on aluminum. Measurements of various materials are presented for comparison in both the UV and visible ranges.
Program for enhancing image contrast by optimizing noncritical film thicknesses
Anne M. Kaiser
Abstract not available.
Integrated Circuit Manufacturing System and Technology
Evaluation of `bag-in-bottle' resist dispense system
Heather C. Prutton, Samuel Geraint Evans
Photoresist defectivity is a major source of yield loss in the manufacture of integrated circuits. Contamination may result from the resist, the resist process, or coating track components. Much work has been carried out to reduce the causes of defects whilst maintaining the properties of the resist film. Exposure of the resist to air is thought to result in solvent loss and moisture absorption, which causes the formation of gel slugs and dried resist flakes. These gel slugs and dried resist flakes form defects in the resist film. Traditional containment systems for resist, such as glass bottles, allow high exposure to air. The irregular physical properties of the resultant contaminants may allow random passage through filtration systems; these contaminants can then be incorporated within the film, causing pattern defects to occur. Sealed "Bag-in-Bottle" (BIB) systems considerably reduce the exposure of the enclosed chemical to the environment, and have been demonstrated to have an effect in controlling pattern defectivity as part of a total system improvement. Potential advantages of such sealed systems include improved package cleanliness, reduced exposure of the resist to air during use, and improved resist utilization. This investigation compares standard glass bottle and BIB containment for undyed and dyed resists. The analysis techniques used were short-loop defectivity monitors, laser scanning of resist-coated wafers, and defectivity analysis on actual devices at the wafer-probe yield check. Chemical analysis of ionic contaminants and total resist utilization were also measured for both containment systems.
Overlay Metrology
Lithographic overlay measurement precision and calibration and their effect on pattern registration optimization
Overlay, or pattern registration, is considered by some to be the most yield-critical metrology element monitored in the semiconductor manufacturing process. Over the years, the aggressive demands of competitive chip design have constantly maintained these specifications at the process capability limit. This has driven the lithographer from somewhat simple process control techniques, like optically read verniers, to computer-automated overlay measurement systems whose outputs are applied to the estimation and correction of full-field systematic error sources, primarily as modeled wafer and lens pattern distortions. When modeled pattern distortions are used to optimize the lithographic overlay process, the point measurement of registration error is no longer the parameter of interest. Instead, the lithographer wishes to measure and minimize the surface-modeled pattern distortions such as translation, rotation, and magnification. Yet it is often neglected that estimates of these parameters are influenced by measurement system errors, resulting in a loss of precision in the estimates of the distortions and the false introduction of otherwise nonexistent distortions, leading to improper determination of the true values for the lens. This paper describes the results of a screening simulation designed to determine the relative effects of measurement system errors on the distortion coefficient estimates produced by a pattern distortion model. The simulation confirms the somewhat obvious result that tool-induced shift (TIS) translates directly into the estimate of the offset term of the model. In addition, the simulation indicates that errors in the measurement system's pixel scale calibration directly scale all distortion estimates by the same factor. The variance of the measurement system sums with the variance of the stepper and inflates the standard error of the regression as well as the uncertainty of each lens parameter's estimate. Higher-order nonlinearities or systematic errors in the response of the registration measurement system do not translate directly into distortion coefficient estimates; rather, they also inflate the uncertainty associated with each distortion's estimate. Heuristic analytical considerations are presented which explain the behavior observed in the simulation and demonstrate that these conclusions apply to the general case.
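A minimal numerical sketch of that screening idea is shown below: a linear overlay model (translation, magnification, rotation) is fitted by least squares to simulated registration errors, and a tool-induced shift folds into the translation estimate while a pixel-scale calibration error inflates every estimated term. The model form, units, and numbers are illustrative assumptions, not the paper's simulation.

```python
# Minimal sketch of the screening idea described above: fit a linear overlay model
# (translation, magnification, rotation) to simulated registration errors and note
# that a tool-induced shift (TIS) folds into the translation estimate, while a pixel
# scale calibration error multiplies every estimated term. All numbers are hypothetical.
import numpy as np

def fit_linear_model(x_mm, y_mm, dx_um):
    """Least-squares fit of dx = Tx + Mag*x - Rot*y (Mag, Rot in um/mm)."""
    A = np.column_stack([np.ones_like(x_mm), x_mm, -y_mm])
    coeffs, *_ = np.linalg.lstsq(A, dx_um, rcond=None)
    return coeffs  # [Tx, Mag, Rot]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    xg, yg = np.meshgrid(np.linspace(-10, 10, 5), np.linspace(-10, 10, 5))
    x, y = xg.ravel(), yg.ravel()                        # field positions, mm

    true_tx, true_mag, true_rot = 0.020, 0.002, 0.001    # um, um/mm, um/mm
    dx_true = true_tx + true_mag * x - true_rot * y      # um

    tis = 0.010        # tool-induced shift, um (assumed)
    scale_err = 1.02   # 2% pixel-scale calibration error (assumed)
    dx_meas = scale_err * (dx_true + tis) + rng.normal(0.0, 0.002, x.size)

    tx, mag, rot = fit_linear_model(x, y, dx_meas)
    print(f"Tx  = {tx:.4f} um   (true 0.020 um; the {tis} um TIS folds straight in)")
    print(f"Mag = {mag:.5f} um/mm, Rot = {rot:.5f} um/mm (each inflated ~2% by scale error)")
```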
Misregistration metrology tool matching in a one-megabit production environment
Mark Andrew Merrill, Sun Yong Lee, Young Nam Kim, et al.
Misregistration control in a large DRAM production facility requires careful selection, matching, and control of metrology equipment. Total metrology tool measurement error tolerances in advanced DRAM manufacturing are fast approaching 15 nm and below. The sources of error must be minimized and maintained at acceptable levels in order to accurately and precisely monitor the process. In a large production facility with many metrology tools, an additional source of error must be considered besides single-tool precision and accuracy: the variation in measurement from tool to tool, or machine matching. This paper presents a method for calibrating multiple metrology tools in a fabrication facility that does not use one tool as a `golden standard,' but rather calibrates each machine for its own induced error. A procedure is then introduced that determines a matching error value for each machine based on its mean deviation from what is considered the correct value as determined by all tools in the facility. This error is expressed as a mean and precision (3 sigma) value and is used to characterize each tool for its `matching error.' Data are then available so that as new machines are introduced into the facility they may be characterized for matching against the current tools without a complete investigation requiring a great amount of downtime on the current tools. Results are shown from an investigation using three KLA 5000 series metrology tools with matching measured over five layers (oxide, poly, nitride, WSi, and metal) using multiple wafers for each layer. A matching error is calculated for each tool for each layer. A method for determining the total tool error for all machines in the facility is also described.
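A minimal sketch of the consensus-based matching calculation described above follows: each tool's readings on common targets are compared with the mean over all tools, and the deviation is summarized as a mean offset and a 3-sigma spread. The function name and sample data are hypothetical.

```python
# Minimal sketch of the consensus-based matching idea described above: each tool's
# readings on common targets are compared with the mean over all tools, and the
# matching error is reported as a mean offset and 3-sigma spread. Sample data are
# hypothetical.
import numpy as np

def matching_errors(readings_nm):
    """readings_nm: (n_tools, n_sites) array of registration readings.
    Returns (mean offset, 3-sigma) of each tool relative to the all-tool mean."""
    readings = np.asarray(readings_nm, dtype=float)
    consensus = readings.mean(axis=0)            # per-site value from all tools
    deviations = readings - consensus            # each tool vs. consensus
    return deviations.mean(axis=1), 3 * deviations.std(axis=1, ddof=1)

if __name__ == "__main__":
    # Three tools measuring the same five overlay targets (nm), hypothetical values.
    data = [[12.0, -8.0, 3.0, 20.0, -5.0],
            [14.5, -6.0, 4.0, 23.0, -3.5],
            [11.0, -9.5, 2.5, 18.5, -6.0]]
    offsets, spreads = matching_errors(data)
    for i, (m, s) in enumerate(zip(offsets, spreads), 1):
        print(f"tool {i}: matching offset = {m:+.2f} nm, 3 sigma = {s:.2f} nm")
```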
Lithographic Process Monitoring/Metrology II
Optical 3D monitoring of VLSI structures
Phase object pattern analysis has shown the possibility of considerably increasing microscope spatial resolution. Experiments have revealed more than ten times resolution enhancement. Some phase object images inside an Airy disk are presented.
Phase-Shift Mask Metrology
Application of atomic-force microscopy to phase-shift masks
In recent years, optical phase-shifting masks (PSMs) have become of interest for the enhancement of submicron lithographic techniques. Various PSM schemes have been published in the literature demonstrating improved performance of optical lithography for 0.5 micrometers features and below. Some of these schemes require features on the PSMs that are micron or submicron in size. Monitoring the depth as well as the lateral dimensions of these small features is important in order to meet the dimensional tolerances. In this paper we report the application of an atomic force microscope (AFM) to obtain both quantitative and qualitative information about the etched features in a PSM.
Integrated Circuit Manufacturing System and Technology
Elimination of send-ahead wafers in an IC fabrication line
Alexander Lee Martin, Louis Anastos, Christopher P. Ausschnitt, et al.
The use of send-ahead wafers to control a lithography sector severely limits the performance of that sector. As a result, the elimination of send-ahead wafers is most desirable. Through the creation of a robust resist process, careful metrology and modeling of the jobs being processed, and the use of statistical process control on key variables, we have achieved a Cp of 0.98 and a Cpk of 0.95 on an 800-nanometer-linewidth, 200-nanometer-overlay lithography technology. These results have been achieved on our 8-inch GCA Autostep 200 2145 i-line steppers at the same time that send-ahead wafers have been eliminated. This paper discusses the work done to stabilize the lithography sector and eliminate send-ahead wafers, the introduction of statistical process control in that sector, and the effects of that introduction on the quality of lots being processed. This paper also presents long-term tool and process stability data.
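For reference, the quoted Cp and Cpk values follow the standard process-capability definitions, stated here in the usual notation (USL and LSL are the specification limits; the mean and standard deviation are estimated from the process data):

```latex
% Standard process-capability definitions behind the quoted Cp and Cpk values.
\[
C_p = \frac{\mathrm{USL} - \mathrm{LSL}}{6\hat\sigma},
\qquad
C_{pk} = \min\!\left(\frac{\mathrm{USL} - \hat\mu}{3\hat\sigma},\;
                     \frac{\hat\mu - \mathrm{LSL}}{3\hat\sigma}\right).
\]
```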
Linewidth Metrology I: Applications
Critical alignments in plane mirror interferometry
Norman Bobroff
A model of two-dimensional in-plane metrology using plane mirror interferometry is developed based on an archetype measuring configuration. The purpose of the model is to define and study the key factors limiting the accuracy of this system. Geometric errors are studied under the assumption that component motions and transducers are linear, but not necessarily in alignment. Abbe offsets depend on higher-order terms and are neglected in the first-order analysis. This study provides some interesting clarifications of which alignment errors are most important and what is meant by cosine error and alignment error.
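As background for the cosine-error discussion, the usual first-order expression is given below; this is the standard textbook form, not a result taken from the paper.

```latex
% First-order form of the cosine error referred to above: a small misalignment
% \theta between the interferometer axis and the stage motion makes the reading
% L\cos\theta, shortening the measured displacement.
\[
L_{\text{meas}} = L\cos\theta \approx L\left(1 - \frac{\theta^2}{2}\right),
\qquad
\Delta L = L - L_{\text{meas}} \approx \frac{L\,\theta^2}{2}.
\]
```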
Lithographic Process Monitoring/Metrology II
Characterization of thickness and perfection of multilayer IC metallization films via x-ray interference phenomena
Simon Bates, Mike Madden, Tommy Hom
X-ray interference, which arises from a coherent interaction between an incident x-ray beam and x-rays refracted and reflected from interfaces within a sample, has been used to nondestructively study the film thickness and interface perfection of multilayer IC metallization films. The approach used to measure and analyze the interference patterns is discussed in detail. Results of the analysis are presented for a single Ti layer and an Al/Ti bilayer on Si. Determination of the Ti layer thickness for both samples was found to be self-consistently accurate to better than 1%. The layer thickness values are compared and contrasted with the results of RBS analysis on the same samples.
Positive-tone surface imaging: methodologies for analysis and process control
Susan K. Jones, Peter Freeman, Edward K. Pavelchek, et al.
A variety of analytical and process control techniques have been employed during process development activities for a 0.5 micrometers deep UV positive tone surface imaging process. Examples of applications of these methods for identification of primary positive tone surface imaging issues and process optimization for enhancement of ultimate resolution are described. Advantages and limitations for each technique are discussed.
Particles and Defects
Particle generation mechanisms in vacuum processing tools
Thomas T. H. Fu, Marylyn Hoy Bennett, R. Allen Bowling
It is estimated that by the year 1995, as much as ninety percent of the contamination in IC manufacturing will be caused by equipment and processes. Contamination can be in the form of particles, defects, scratches, stains, and so on. All are major concerns for yielding ULSI devices. In order to eliminate process/equipment-induced particles, particle formation/generation must be understood before appropriate action can be taken to meet the contamination-free requirements of the future. A variety of vacuum processing tools were studied, including CVD, PECVD, and plasma etch systems with heat lamps, RF, and remote microwave energy sources. A particle collection and characterization methodology was adopted to analyze the particles generated from the vacuum processing tools. By using SEM and EDS to analyze particles collected from equipment chamber walls, both the particle morphology and composition were discerned. The elemental analyses indicate that the composition of particles varied a great deal depending on the chemical nature of the process, chamber material/process compatibility, and energy source.
Phase-Shift Mask Metrology
Measurement of a phase-shifting mask with the Mirau correlation microscope
Stanley S. C. Chim, Gordon S. Kino
We describe here an interference microscope, the Mirau correlation microscope, for examining a phase-shifting mask used in lithography. The accuracy of the phase measurements obtained was +/- 2 degrees (or equivalently +/- nm in height variations) at 577 nm wavelength, with a transverse resolution of the order of 0.4 micrometers. A transmission phase-shifting mask was examined. The amplitude image showed the locations of the bright (transmitting) and dark (chromium) regions of the mask, while the phase shifts introduced by the mask were revealed by the phase image. Mask defects in the chromium regions and phase-shifting errors in the transmitting regions could thus be readily identified.