- Front Matter: Volume 7272
- Keynote Session
- Methods for Today
- Solutions for Tomorrow
- Overlay
- Line Edge/Width Roughness
- SEM I
- Scatterometry I
- SEM II
- Diffraction-Based Overlay
- Mask Metrology
- Inspection
- Process Control
- Scatterometry II
- Reference Metrology
- Poster Session
Front Matter: Volume 7272
This PDF file contains the front matter associated with SPIE Proceedings Volume 7272, including the Title Page, Copyright information, Table of Contents, and the Conference Committee listing.
Keynote Session
Improving optical measurement accuracy using multi-technique nested uncertainties
This paper compares and contrasts different combinations of scatterfield and scatterometry optical configurations, and introduces a new approach to embedding atomic force microscopy (AFM) or other reference metrology results directly in the uncertainty analysis and library-fitting process to reduce parametric uncertainties. We present both
simulation results and experimental data demonstrating this new method, which is based on the application of a Bayesian
analysis to library-based regression fitting of optical critical dimension (OCD) data. We develop the statistical methods
to implement this approach of nested uncertainty analysis and give several examples, which demonstrate reduced
uncertainties in the final combined measurements. The approach is also demonstrated through a combined reference
metrology application using several independent measurement methods.
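The Bayesian combination at the core of this approach can be illustrated, in highly simplified form, as an inverse-variance combination of an optical (OCD) estimate with an AFM reference acting as a prior; the function and all numbers below are illustrative only, not taken from the paper.

```python
# Minimal sketch: Bayesian (inverse-variance) combination of an optical CD
# estimate with an independent AFM reference measurement of the same parameter.
# All values are invented for illustration.

def combine(mu_ocd, var_ocd, mu_afm, var_afm):
    """Posterior mean/variance for a Gaussian prior (AFM) and likelihood (OCD)."""
    w_ocd = 1.0 / var_ocd
    w_afm = 1.0 / var_afm
    var_post = 1.0 / (w_ocd + w_afm)
    mu_post = var_post * (w_ocd * mu_ocd + w_afm * mu_afm)
    return mu_post, var_post

mu, var = combine(mu_ocd=45.8, var_ocd=0.49, mu_afm=45.2, var_afm=0.25)
# The combined variance is always smaller than either input variance,
# which is the sense in which the nested analysis "reduces uncertainty".
print(mu, var)
```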
The measurement uncertainty challenge of advanced patterning development
The trend of reducing feature size in ICs requires tighter control of critical dimension (CD) variability for optimal device performance. This drives a need to characterize the variability accurately in order to have reliable metrics to drive improvement in development. Variation in CDs can come from several sectors, such as mask, OPC, litho, and etch. Metrology is involved in all of these sectors, and it is important to understand the accuracy limitations in metrology that contribute to CD variability. Inaccuracy of the CD-SEM algorithm arising from profile variations is one example. Profile variation can result from process and design variation. Total Measurement Uncertainty (TMU) is a metric dependent on the precision of the tool under test (here, the CD-SEM) and its relative accuracy, and can track the accuracy of CD measurements in the presence of varying profiles. This study explores metrology limitations in capturing the design and process contributions to CD variation at the post-litho step. In this paper a lithography scanner focus-exposure matrix wafer was used to capture the process variation. CD and profile data are taken from fields at varying focus. The sample plan described in this paper also covers design variation by including nested and isolated features of various sizes. An appropriate averaging methodology has been adopted in an attempt to decouple the process- and design-related CD variation contributions to TMU. While the tool precision can be suppressed by sufficient averaging, the relative accuracy cannot. This relative accuracy is affected by the complex CD-SEM probe-to-sample interactions and the sensitivity of CD-SEM algorithms to different feature profiles. One consequence is that the average offsets between physical CDs (CD-AFM) and SEM CDs change significantly with defocus. TMU worsens as the focus range is increased from nominal focus. This paper explores why this is so and also discusses the challenges for CD-AFM in accurately measuring complex and varying profiles. There is a discussion of the implications of this study for production measurement uncertainty, OPC calibration measurement at process-of-record conditions, and process-window OPC. Results of optimizing the CD-SEM algorithm to achieve superior accuracy across both design- and process-induced variation will also be presented.
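As a rough sketch of how a TMU-style figure is commonly computed (regress the tool under test against the reference, take the net residual error, and remove the reference-metrology uncertainty in quadrature), the following uses ordinary least squares in place of the Mandel regression used in practice; all data and uncertainty values are invented for illustration.

```python
import math

def tmu(cd_sem, cd_afm, u_rms):
    """Rough TMU estimate: regress the tool under test (CD-SEM) against the
    reference (CD-AFM), take the net residual error of the fit, and remove
    the reference-metrology uncertainty u_rms in quadrature.
    Ordinary least squares stands in here for the Mandel regression."""
    n = len(cd_sem)
    mx = sum(cd_afm) / n
    my = sum(cd_sem) / n
    sxx = sum((x - mx) ** 2 for x in cd_afm)
    sxy = sum((x - mx) * (y - my) for x, y in zip(cd_afm, cd_sem))
    slope = sxy / sxx
    intercept = my - slope * mx
    resid = [y - (slope * x + intercept) for x, y in zip(cd_afm, cd_sem)]
    nre = math.sqrt(sum(r * r for r in resid) / (n - 2))  # net residual error
    return math.sqrt(max(nre ** 2 - u_rms ** 2, 0.0))

# Illustrative data (nm): reference CDs and slightly noisy SEM readings
afm = [30.0, 35.0, 40.0, 45.0, 50.0, 55.0]
sem = [31.1, 35.8, 41.3, 45.7, 51.4, 55.9]
print(tmu(sem, afm, u_rms=0.3))
```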
Methods for Today
CD-SEM parameter influence on image resolution and measurement accuracy
To determine scanning electron microscopy (SEM) image resolution, we processed a wafer-based sample that
demonstrates directionally isotropic, yet high frequency, details after Fourier transformation [3],[4],[5]. We developed a
new fully automatable method that outputs the SEM resolution, beam shape, and eccentricity as results. To investigate how further parameters (e.g., scanning conditions, acceleration voltage) affect the resolution, and how the resolution in turn influences critical dimension (CD) measurement accuracy, wafers with test structures covering a wide range of nominal sizes of lines and contact holes were created. For all CD-SEM measurements, we used a calibrated CD-atomic force microscope (CD-AFM) as a
reference [6]. CD-SEM measurements were done on different tool generations with variations in best achievable
resolution.
Experimental SEM resolution results will be shown, including influences of focus and stigmation. Both the wafer sample
for resolution monitoring and a new Fourier-based evaluation method show significant sensitivity to variations in these
parameters. By comparing the resolution results in the X and Y directions, astigmatism can be estimated. Even stigmation drifts smaller than normal daily variations can be observed. Accuracy differences among different CD-SEM tool generations will be shown for features ranging from less than 30 nm to 500 nm, which is revealing when trying to understand the quantitative influence of the SEM resolution on measurement accuracy.
Role of CDAFM in achieving accurate OPC modeling
The accuracy of patterning strongly impacts the profitability of IC manufacturing and depends on the accuracy of optical proximity correction (OPC) and resolution enhancement technology (RET) models. Despite its importance, the accuracy of RET and OPC models is not known in most cases. The accuracy of the CD-SEM data used to build the models is questionable: sample-to-sample bias variation of the CD-SEM exceeds the required sub-nanometer uncertainty budget. CD-AFM can be used for reference and bias correction. The use of AFM in several areas of RET and OPC modeling is discussed. Accurate inputs and feedback during development reduce the number of learning cycles and improve the quality of the models.
Sampling for advanced overlay process control
Overlay metrology and control have been critical for successful advanced microlithography for many years, and are
taking on an even more important role as time goes on. Due to throughput constraints it is necessary to sample only a
small subset of overlay metrology marks, and typical sample plans are static over time. Standard production monitoring
and control involves measuring sufficient samples to calculate up to 6 linear correctables. As design rules shrink and
processing becomes more complex, however, it is necessary to consider higher order models with additional degrees of
freedom for control, fault detection, and disposition. This, in turn, requires a higher level of sampling and careful attention to flyer removal. Due to throughput concerns, however, rigorous statistical methods are needed to establish a baseline sampling plan. This study focuses on establishing a 3x nm node immersion
lithography production-worthy sampling plan for 3rd order modeling, verification of the accuracy, and proof of
robustness of the sampling. In addition we discuss motivation for dynamic sampling for application to higher order
modeling.
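The "up to 6 linear correctables" mentioned above are, in one common parameterization, per-axis translation, magnification, and rotation terms fit to sampled overlay errors across the wafer; the following minimal least-squares sketch uses invented coordinates and synthetic data to show the idea, and the exact parameterization varies by tool vendor.

```python
import numpy as np

def fit_linear_correctables(x, y, dx, dy):
    """Least-squares fit of 6 linear overlay correctables.
    Model (one common parameterization, an illustrative assumption):
        dx = Tx + Mx*x - Rx*y
        dy = Ty + My*y + Ry*x
    With x, y in mm and dx, dy in nm, Mx/My come out in nm/mm (i.e. ppm)."""
    Ax = np.column_stack([np.ones_like(x), x, -y])   # columns: [Tx, Mx, Rx]
    Ay = np.column_stack([np.ones_like(y), y, x])    # columns: [Ty, My, Ry]
    tx, mx, rx = np.linalg.lstsq(Ax, dx, rcond=None)[0]
    ty, my, ry = np.linalg.lstsq(Ay, dy, rcond=None)[0]
    return tx, mx, rx, ty, my, ry

# Synthetic data: 40 marks on a 300 mm wafer with known correctables + noise
rng = np.random.default_rng(1)
x = rng.uniform(-150, 150, 40)
y = rng.uniform(-150, 150, 40)
dx = 2.0 + 0.05 * x - 0.03 * y + rng.normal(0, 0.5, 40)
dy = -1.0 + 0.04 * y + 0.02 * x + rng.normal(0, 0.5, 40)
print(fit_linear_correctables(x, y, dx, dy))
```

Higher-order models mentioned in the abstract extend the design matrices with polynomial terms in x and y, which is why they demand denser sampling to stay well conditioned.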
Simultaneous overlay and CD measurement for double patterning: scatterometry and RCWA approach
As optical lithography advances to 32 nm technology node and beyond, double patterning technology (DPT)
has emerged as an attractive solution to circumvent the fundamental optical limitations. DPT poses unique demands on
critical dimension (CD) uniformity and overlay control, making the tolerance decrease much faster than the rate at which
critical dimension shrinks. This, in turn, makes metrology even more challenging. In the past, multi-pad diffraction-based overlay (DBO) using an empirical approach has been shown to be effective in measuring overlay error associated with double patterning [1]. In this method, registration errors for double patterning were extracted from
specially designed diffraction targets (three or four pads for each direction); CD variation is assumed negligible within
each group of adjacent pads and not addressed in the measurement. In another paper, encouraging results were reported
with a first attempt at simultaneously extracting overlay and CD parameters using scatterometry [2].
In this work, we apply scatterometry with a rigorous coupled wave analysis (RCWA) approach to characterize
two double-patterning processes: litho-etch-litho-etch (LELE) and litho-freeze-litho-etch (LFLE). The advantage of
performing rigorous modeling is to reduce the number of pads within each measurement target, thus reducing space
requirement and improving throughput, and simultaneously extract CD and overlay information. This method measures
overlay errors and CDs by fitting the optical signals with spectra calculated from a model of the targets. Good
correlation is obtained between the results from this method and those of several reference techniques, including empirical
multi-pad DBO, CD-SEM, and IBO. We also perform total measurement uncertainty (TMU) analysis to evaluate the
overall performance. We demonstrate that scatterometry provides a promising solution to meet the challenging overlay
metrology requirement in DPT.
Reference metrology in a research fab: the NIST clean calibrations thrust
In 2004, the National Institute of Standards and Technology (NIST) commissioned the Advanced Measurement
Laboratory (AML) - a state-of-the-art, five-wing laboratory complex for leading edge NIST research. The NIST
NanoFab - a 1765 m2 (19,000 ft2) clean room with 743 m2 (8000 ft2) of class 100 space - is the anchor of this facility
and an integral component of the new Center for Nanoscale Science and Technology (CNST) at NIST.
Although the CNST/NanoFab is a nanotechnology research facility with a different strategic focus than a current high
volume semiconductor fab, metrology tools still play an important role in the nanofabrication research conducted here.
Some of the metrology tools available to users of the NanoFab include stylus profiling, scanning electron microscopy
(SEM), and atomic force microscopy (AFM).
Since 2001, NIST has collaborated with SEMATECH to implement a reference measurement system (RMS) using
critical dimension atomic force microscopy (CD-AFM). NIST brought metrology expertise to the table and
SEMATECH provided access to leading edge metrology tools in their clean room facility in Austin. Now, in the newly
launched "clean calibrations" thrust at NIST, we are implementing the reference metrology paradigm on several tools in
the CNST/NanoFab. Initially, we have focused on calibration, monitoring, and uncertainty analysis for a three-tool set
consisting of a stylus profiler, an SEM, and an AFM.
Our larger goal is to develop new and supplemental calibrations and standards that benefit from the Class 100 environment available in the NanoFab, and to offer our customers calibration options that do not require exposing their samples to less clean environments. Toward this end, we have completed a preliminary evaluation of the
performance of these instruments. The results of these evaluations suggest that the achievable uncertainties are
generally consistent with our measurement goals.
Solutions for Tomorrow
Evaluation of a new metrology technique to support the needs of accuracy, precision, speed, and sophistication in near-future lithography
A new metrology technique is being evaluated to address the need for accuracy, precision, speed, and sophistication in metrology for near-future lithography. Attention must be paid to these stringent requirements, as current metrology capabilities may not be sufficient to support these near-future needs. Sub-nanometer requirements in accuracy and precision, along with the demand for increased sampling, trigger the need for such an evaluation.
This is a continuation of work published at the SPIE Asia conference in 2008. In this technical presentation the authors continue reporting the newest results from the evaluation of this technology, a new scatterometry-based platform under development at ASML, which has the potential to support these future needs.
Extensive data collection and tests are ongoing for both CD and overlay. Previous data showed overlay performance on production layers [1] that meet 22 nm node requirements. The new data discussed in this presentation is from further investigation on more process robust overlay targets and smaller target designs. Initial
CD evaluation data is also discussed.
MOSAIC: a new wavefront metrology
MOSAIC is a new wavefront metrology that enables complete wavefront characterization from print or aerial
image based measurements. Here we describe MOSAIC and verify its utility with a model-based proof of
principle.
Immersion specific error contribution to overlay control
Ever since the introduction of immersion lithography, overlay has been a primary concern. Immersion exposure tools show an overlay fingerprint that we hope to correct by introducing correctables per field, i.e., a piece-wise approximation of the fingerprint within the correction capabilities of the exposure tool. If this mechanism is to be used for reducing overlay errors, it must be stable over an entire batch. This type of correction requires a substantial number of measurements; therefore, it would be ideal if the fingerprint were also stable over time. These requirements are of
particular importance for double patterning where overlay budgets have been further reduced. Since the variation of the
fingerprint specific to immersion tools creeps directly into the overlay budget, it is important to know how much of the
total overlay error can be attributed to changes in the immersion fingerprint. In this paper we estimate this immersion
specific error but find it to be a very small contributor.
Overlay similarity: a new overlay index for metrology tool and scanner overlay fingerprint methodology
For different CD metrologies, such as average CD from CD-SEM and optical CD (OCD) from scatterometry, the point-to-point CD R2 has been well adopted as the CD correlation index. For different overlay metrologies, such as image-based box-in-box overlay and scatterometry-based overlay, we propose the cosine similarity as the correlation index for overlay. The cosine similarity is a measure of similarity between two vectors of n dimensions obtained by finding the cosine of the angle between them, and it is often used to compare documents in text mining. It has been widely used in web and document search engines and can be used as the similarity index for overlay tool-to-tool matching and for scanner tool-to-tool or day-to-day fingerprints.
In this paper, we demonstrate that the cosine similarity has very high sensitivity to overlay tool performance. We compared the similarities of three generations (A1, A2, A3) of the overlay tools of vendors A and B and found that, after target re-training and TIS correction on each tool, the similarity of A1 to A3 can be improved from 0.9837 to 0.9951. Overlay point-to-point matching of A3 vs. A1 can be reduced from 4.8 to 2.1 nm. The tool precision similarities, i.e., each tool's best similarity to itself, for A1, A2, A3, and B are 0.9986, 0.9990, 0.9995, and 0.9994, respectively. From these results, we demonstrate that an old-generation overlay tool, with suitable hardware maintenance, can be matched to the latest-generation overlay tool.
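As a minimal illustration of the proposed index, the cosine similarity of two overlay vectors (e.g. point-to-point readings of the same sites on two tools) follows directly from its definition; the readings below are invented:

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two vectors of equal length, e.g. the
    concatenated overlay readings of two tools at the same measurement sites."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Illustrative point-to-point overlay readings (nm) from two tools at 5 sites
tool_a = [3.1, -2.0, 1.2, 0.4, -1.8]
tool_b = [3.0, -2.2, 1.1, 0.5, -1.7]
print(round(cosine_similarity(tool_a, tool_b), 4))
```

A value near 1 indicates the two tools see the same overlay fingerprint; orthogonal (unrelated) fingerprints score near 0.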
Tabletop coherent diffractive microscopy with extreme ultraviolet light from high harmonic generation
We demonstrate lensless diffractive microscopy using a tabletop source of extreme ultraviolet (EUV) light from high harmonic generation at 29 nm and 13.5 nm. High harmonic generation has been shown to produce fully spatially coherent EUV light when the conversion process is well phase-matched in a hollow-core waveguide. We use this spatial coherence for two related diffractive imaging techniques which circumvent the need for lossy imaging optics in the EUV region of the spectrum. Holography with a reference beam gives sub-100 nm resolution in short exposure times with fast image retrieval. Application of the Guided Hybrid Input-Output phase retrieval algorithm refines the image resolution to 53 nm with 29 nm light. Initial images using the technologically important 13.5 nm wavelength give 92-nm resolution in a 10-minute exposure. Straightforward extensions of this work should also allow near-wavelength resolution with the 13.5 nm source. Diffractive imaging techniques provide eased alignment and focusing requirements
as compared with zone plate or multilayer mirror imaging systems. The short-pulse nature of the extreme ultraviolet source will allow pump-probe imaging of materials dynamics with time resolution down to the EUV pulse duration.
Overlay
Overlay metrology for double patterning processes
Double patterning technology (DPT) is foreseen by the industry to be the main solution for the 32 nm technology node and even beyond. Meanwhile, process compatibility has to be maintained and the performance of overlay metrology has to improve. To achieve this for Image Based Overlay (IBO), usually the optics of overlay tools are improved. It was also demonstrated that these requirements are achievable with a Diffraction Based Overlay (DBO) technique named SCOL™
[1]. In addition, we believe that overlay measurements with respect to a reference grid are required to achieve the
required overlay control [2]. This induces at least a three-fold increase in the number of measurements (2 for double
patterned layers to the reference grid and 1 between the double patterned layers). The requirements of process
compatibility, enhanced performance and large number of measurements make the choice of overlay metrology for DPT
very challenging.
In this work we use different flavors of the standard overlay metrology technique (IBO) as well as the new technique
(SCOL) to address these three requirements. The compatibility of the corresponding overlay targets with double
patterning processes (Litho-Etch-Litho-Etch (LELE); Litho-Freeze-Litho-Etch (LFLE), Spacer defined) is tested. The
process impact on different target types is discussed (CD bias LELE, Contrast for LFLE). We compare the standard
imaging overlay metrology with non-standard imaging techniques dedicated to double patterning processes (multilayer
imaging targets allowing one overlay target instead of three, very small imaging targets). In addition to standard designs
already discussed [1], we investigate SCOL target designs specific to double patterning processes. The feedback to the
scanner is determined using the different techniques. The final overlay results obtained are compared accordingly. We
conclude with the pros and cons of each technique and suggest the optimal metrology strategy for overlay control in
double patterning processes.
Implementation of the high order overlay control for mass production of 40nm node logic devices
To satisfy the tight critical dimension budget, an immersion exposure process is widely applied to the critical layers of recent advanced devices to achieve high resolution performance. In our 40nm node logic devices, the overlay accuracy of the critical layers (immersion to immersion) is required to be less than 15nm (Mean+3sigma), and that of the sub-critical layers (dry to immersion) is required to be less than 20nm (Mean+3sigma). Furthermore, the overlay accuracy of the critical layers might need to be less than 10nm (Mean+3sigma) in 32nm node logic devices. Methods of improving the overlay performance should be investigated for mass production in the future.
In this report, with emphasis on productivity, we selected the technique of high-order process correction with machine configuration and applied it to 40nm node production. We evaluated the overlay performance of the critical layers using a 40nm process-stack wafer and found that high-order grid compensation was effective in reducing the process impact on overlay accuracy. Furthermore, for the sub-critical layers, high-order grid compensation was also effective in controlling the tool matching error.
Using intrafield high-order correction to achieve overlay requirement beyond sub-40nm node
Overlay requirements for semiconductor devices are getting more demanding as the design rule shrinks.
According to ITRS expectation[1], on product overlay budget is less than 8nm for the DRAM 40nm
technology node. In order to meet this requirement, all overlay error sources have to be analyzed and
controlled, including systematic, random, and even intra-field high-order errors. In this paper, we studied the possibility of achieving <7nm overlay control in mass production by using CPE (Correction Per Exposure) mode and intra-field high-order correction (i-HOPC). CPE is one of the functions in the GridMapper package, a method that applies a correction for each exposure to compensate for both systematic and random overlay errors. If the intra-field overlay shows a non-linear fingerprint, e.g. due to either wafer processing or reticle pattern placement errors, the intra-field High Order Process Correction (i-HOPC) provided by ASML can be used to compensate for this error. We performed the experiments on an immersion tool which has the GridMapper functionality. In our experiment, the previous layer was exposed on a dry machine. The wet-to-dry matching represents more realistic scanner usage in the fab environment; thus, the results contain the additional contribution of immersion-to-dry matched-machine overlay. Our test results show that the overlay can be improved by 70%, and the mean+3sigma of full-wafer measurement can reach the range of 5 to 6nm. In this paper we also discuss the capability of implementing CPE in the mass production environment, since CPE requires additional wafer measurement to create the proper overlay correction.
Polar Correction: new overlay control method for higher-order intra-field error dependent on the wafer coordinates
A new overlay control method called "Polar Correction" has been developed.
In the 3x nm half-pitch generation and beyond, even when using a high-end optical exposure system such as immersion lithography with NA over 1.3, overlay accuracy becomes the most critical issue, and accuracy below 10nm is indispensable [1]. In view of the severe overlay accuracy required, shot-to-shot intra-field overlay control cannot be disregarded in this generation. In particular, shot-to-shot intra-field overlay error caused by the influence of evaporation heat has appeared with the immersion exposure system. However, it is impossible to correct the shot-to-shot intra-field overlay error with the conventional overlay control method. Therefore, we have developed a new overlay control method, called Polar Correction, for higher-order intra-field error dependent on the wafer coordinates.
In this paper, we explain our new overlay control method for higher-order intra-field error and show simulation and experimental data. We believe that Polar Correction can meet the below-10nm overlay accuracy of this generation.
Effects of plasma spatial profile on conversion efficiency of laser produced plasma sources for EUV lithography
Extreme ultraviolet (EUV) lithography devices that use laser produced plasma (LPP), discharge produced plasma (DPP),
and hybrid devices need to be optimized to achieve sufficient brightness with minimum debris generation to support the
throughput requirements of High-Volume Manufacturing (HVM) lithography exposure tools with long lifetime. Source
performance, debris mitigation, and reflector system are all critical to efficient EUV collection and component lifetime.
Enhanced integrated models continue to be developed using the HEIGHTS computer package to simulate EUV emission at high power as well as debris generation and transport in multiple and colliding LPPs. A new Center for Materials Under Extreme Environments (CMUXE) has been established to benchmark the HEIGHTS models on various EUV-related issues. The models being developed and enhanced include, for example, new ideas and parameters for multiple laser beams in different geometrical configurations and with different pre-pulses to maximize EUV production. Recent experimental and theoretical work shows a large influence of hydrodynamic processes on EUV generation. The effect of plasma hydrodynamics evolution on EUV radiation generation was analyzed for planar and spherical geometries of a tin target in LPP devices. The higher efficiency of the planar target in comparison to the spherical geometry is explained by better hydrodynamic containment of the heated plasma. This is not the case if the plasma is slightly overheated. Recent experimental results for the conversion efficiency (CE) of LPPs are in good agreement with HEIGHTS simulations.
Line Edge/Width Roughness
Dark-field optical scatterometry for line-width-roughness metrology
As CMOS transistor critical dimensions (CDs) shrink to 35 nm and below, monitoring and control of line width
roughness (LWR) and line edge roughness (LER) will become increasingly important. We used dark-field two-dimensional beam profile reflectometry at a 405 nm wavelength with a 0.9 numerical aperture (NA) objective to measure
the low levels of diffuse scattered light from the roughness on the surfaces of lines in test structures on a wafer created
by ISMI. This wafer contains a variety of amorphous etched gate test structures with a range of CDs from approximately
20 nm to 50 nm. Selected structures were thoroughly characterized for CD, LER and LWR by a critical-dimension
scanning electron microscope (CD-SEM). The integrated diffuse scattered intensities obtained from structures with
different CD and LWR values were compared to LWR as measured by the CD-SEM. The diffuse scattered optical signal
intensity showed, at best, a weak correlation to the CD-SEM measured LWR. However, a plot of the diffuse scattered
intensity versus CD-SEM measured CD showed a strong, but nonlinear, correlation. This indicates that the scattering
depends not only on the surface roughness but also on the CD of the line (and presumably other details of the profile).
A CD AFM study of the plasma impact on 193nm photoresist LWR: role of plasma UV and ions
193nm photoresist patterns printed by optical lithography are known to present significant sidewall roughness, also called linewidth roughness (LWR), after the lithographic step, which is partially transferred into the underlayers during plasma etching processes. This study aims to identify the factors that impact the photoresist pattern sidewall roughness during plasma exposure. Among them, plasma VUV light (110-210nm) is identified as being the main contributor to the LWR decrease during plasma etching processes. Moreover, it was found that the LWR obtained after plasma exposure is strongly dependent on the surface roughening mechanisms taking place at the top of the resist pattern.
SEM metrology damage in polysilicon line and its impact on LWR evaluation
The importance of line-width roughness (LWR)/line-edge roughness (LER) in relation to leakage current has made it a critical process parameter, especially as devices scale down to the 32nm era. The critical dimension scanning electron microscope (SEM) is still the most popular tool to characterize LWR in semiconductor manufacturing. However, true LWR metrology by SEM has been a challenge because of metrology noise and induced damage. Methods of repeating multiple shots or long scanning times [5-8] on the same field of view have been proposed to effectively eliminate metrology noise caused by SEM tool variation and image processing. With such methods, however, line damage (non-conformal critical dimension enlargement) caused by charging is found, and the damage level depends on the property of the surface material, the e-beam energy, and the total scanning time. In this article, the impact of damage on LWR metrology and the optimal metrology conditions are studied. Following the proposed method [8], polysilicon lines both with and without an oxide mask are investigated under different e-beam energies and numbers of shots. LWR metrology noise decreases with e-beam energy and saturates at 1400eV, while the damage level is proportional to beam energy. Similarly, the larger the number of repeated shots, the more effective the noise rejection, but also the greater the line damage. The damage not only increases metrology noise but also degrades the LWR, as verified in both simulation and experiment. Finally, the optimal conditions for true LWR metrology by SEM for polysilicon lines both with and without an oxide mask are proposed.
Process variation monitoring (PVM) by wafer inspection tool as a complementary method to CD-SEM for mapping LER and defect density on production wafers
As design rules shrink, Critical Dimension Uniformity (CDU) and Line Edge Roughness (LER) constitute a higher percentage of the line-width, and hence the need to control these parameters increases. Sources of CDU and LER variations include: scanner auto-focus accuracy and stability, lithography stack thickness and composition variations, exposure variations, etc. These process variations in advanced VLSI manufacturing processes, specifically in memory devices where CDU and LER affect cell-to-cell parametric variations, are well known to significantly impact device performance and die yield. Traditionally, measurements of LER are performed by CD-SEM or Optical Critical Dimension (OCD) metrology tools. Typically, these measurements require a relatively long time and cover only a small fraction of the wafer area. In this paper we present the results of a collaborative work of the Process Diagnostic & Control Business Unit of Applied Materials® and Nikon Corporation®, on the implementation of a method complementary to the CD-SEM and OCD tools for monitoring post-litho-develop CDU and LER on production wafers. The method, referred to as Process Variation Monitoring (PVM), is based on measuring variations in the light reflected from periodic structures, under optimized illumination and collection conditions, and is demonstrated using an Applied Materials DUV brightfield (BF) wafer inspection tool. It will be shown that full polarization control in the illumination and collection paths of the wafer inspection tool is critical to enable an optimized Process Variation Monitoring recipe.
SEM I
Validation of CD-SEM etching residue evaluation technique for MuGFET structures
In a previous study, we reported on the CD measurement of multi-gate field-effect transistors (MuGFETs) using CD-SEM. We focused on the etching residue at the fin-gate intersection, which causes gate length variation and affects device performance. We therefore proposed a technique to quantify the amount of etching residue from CD-SEM top-down images, introducing the increment of the gate linewidth at the fin sidewall as the "residue index". In this study, experiments were carried out to validate the residue index measurement technique. First, the actual shape of the etching residue was verified in detail by high-resolution experimental-SEM and STEM cross-sectional imaging techniques. Next, the measurement capability of the CD-SEM image was confirmed by comparison with the high-resolution experimental-SEM measurement results. Finally, the proposed technique was applied to the evaluation of the layout dependency of the residue index, and it was confirmed that the residue index has enough sensitivity to quantify the systematic residue size variation related to fin A/R (aspect ratio). Thus, we confirmed the reliability of the proposed technique. The residue index measurement technique is expected to be useful for evaluating the gate etching process of the MuGFET.
Sensitivity of SEM width measurements to model assumptions
Show abstract
The most accurate width measurements in a scanning electron microscope (SEM) require raw images to be corrected for
instrumental artifacts. Corrections are based upon a physical model that describes the sample-instrument interaction.
Models differ in their approaches or approximations in the treatment of scattering cross sections, secondary electron (SE)
generation, material properties, scattering at the surface potential barrier, etc. Corrections that use different models
produce different width estimates. We have implemented eight models in the JMONSEL SEM simulator. Two are
phenomenological models based upon fitting measured yield vs. energy curves. Two are based upon a binary scattering
model. Four are variants of a dielectric function approach. These models are compared to each other in pairwise
simulations in which the output of one model is fit to the other by using adjustable parameters similar to those used to fit
measured data. The differences in their edge position parameters are then a measure of how much these models differ with
respect to a width measurement. With electron landing energy, beam width, and other parameters typical of those used in
industrial critical dimension measurements, the models agreed to within ±2.0 nm on silicon and ±2.6 nm on copper in
95% of comparisons.
Accurate electrical prediction of memory array through SEM-based edge-contour extraction using SPICE simulation
Show abstract
Continued transistor scaling, toward smaller devices with similar (or larger) drive current per micron and faster switching, increases the challenge of predicting and controlling the transistor off-state current. Typically, electrical simulators such as SPICE use the design intent (as-drawn GDS data); in more sophisticated cases, the simulators are fed with the pattern after lithography and etch process simulations. As the importance of electrical simulation accuracy increases and leakage becomes more dominant, there is a need to feed these simulators with more accurate information extracted from physical on-silicon transistors. In this paper we apply our methodology for predicting changes in device performance due to systematic lithography and etch effects. In general, the methodology consists of using OPCCmax™ for systematic Edge-Contour-Extraction (ECE) from transistors, capturing manufacturing effects including image distortions such as line-end shortening, corner rounding, and line-edge roughness. These measurements are then used for SPICE modeling. A possible application of this new metrology is to provide ahead-of-time physical and electrical statistical data, improving time to market. In this work, we applied our methodology to analyze small and large arrays of 2.14 μm² 6T-SRAM, manufactured using the Tower Standard Logic for General Purposes Platform. Four of the six transistors use "U-Shape AA", known to have higher variability. The predicted electrical performance of the transistors, drive current and leakage current, in terms of nominal values and variability, is presented. We also used the methodology to analyze an entire SRAM block array, and a study of isolation leakage and variability is presented.
Scatterometry I
Developing an uncertainty analysis for optical scatterometry
Show abstract
This article describes how an uncertainty analysis may be performed on a scatterometry measurement. A
method is outlined for propagating uncertainties through a least-squares regression. The method includes the
propagation of the measurement noise as well as estimates of systematic effects in the measurement. Since there
may be correlations between the various parameters determined by the measurement, a method is described
for visualizing the uncertainty in the extracted profile. The analysis is performed for a 120 nm pitch grating,
consisting of photoresist lines 120 nm high, 45 nm critical dimension, and 88° side wall angle, measured with a
spectroscopic rotating compensator ellipsometer. The results suggest that, while scatterometry is very precise,
there are a number of sources of systematic errors that limit its absolute accuracy. Addressing those systematic
errors may significantly improve scatterometry measurements in the future.
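The propagation of measurement noise through a least-squares regression described above can be sketched with the standard Jacobian-based covariance estimate. The straight-line model, wavelength grid, and 1% noise level below are illustrative assumptions, not the paper's actual scatterometry model:

```python
import numpy as np

def parameter_covariance(jacobian, sigma_y):
    """Parameter covariance of a weighted least-squares fit.

    For chi^2 = sum_i ((y_i - f_i(p)) / sigma_i)^2, the covariance of
    the fitted parameters near the optimum is (J^T W J)^-1, where
    J_ij = d f_i / d p_j and W = diag(1 / sigma_i^2).
    """
    W = np.diag(1.0 / np.asarray(sigma_y) ** 2)
    return np.linalg.inv(jacobian.T @ W @ jacobian)

# Toy model: a measured signature linear in two parameters,
# y = p0 + p1 * x, sampled at 5 points with 1% measurement noise.
x = np.linspace(0.0, 1.0, 5)
J = np.column_stack([np.ones_like(x), x])        # df/dp0, df/dp1
cov = parameter_covariance(J, sigma_y=0.01 * np.ones_like(x))
sigma_p = np.sqrt(np.diag(cov))                  # 1-sigma uncertainties
corr = cov[0, 1] / (sigma_p[0] * sigma_p[1])     # parameter correlation
```

The off-diagonal covariance term is what drives the correlated-parameter visualization the abstract mentions: a strongly negative `corr` indicates the two parameters cannot be determined independently from the data.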
Effect of line-width roughness on optical scatterometry measurements
Show abstract
Line width roughness (LWR) has been identified as a potential source of uncertainty in scatterometry measurements, and
characterizing its effect is required to improve the method's accuracy and to make measurements traceable. In this work,
we extend previous work by using rigorous coupled wave (RCW) analysis on two-dimensionally periodic structures to
examine the effects of LWR. We compare the results with simpler models relying upon a number of effective medium
approximations. We find that the effective medium approximations yield an approximate order of magnitude indicator of
the effect, but that the quantitative agreement may not be good enough to include in scatterometry models.
Product-driven material characterization for improved scatterometry time-to-solution
Show abstract
This paper discusses a novel methodology of material characterization that directly utilizes the scatterometry targets on
the product wafer to determine the optical properties (n&k) of various constituent materials. Characterization of optical
constants, or dispersions, is one of the first steps of scatterometry metrology implementation. A significant benefit of
this new technique is faster time-to-solution, since neither multiple single-film depositions nor multi-film depositions on
blanket/product wafers are needed, making obsolete a previously required, but very time-consuming, step in the
scatterometry setup. We present the basic elements of this revolutionary method, describe its functionality as currently
implemented, and contrast/compare results obtained by traditional methods of materials characterization with the new
method. The paper covers scatterometry results from key enabling metrology applications, like high-k metal gate (postetch
and post-litho) and Metal 2 level post-etch, to explore the performance of this new material characterization
approach. CDSEM was used to verify the accuracy of scatterometry solutions. Furthermore, Total Measurement
Uncertainty (TMU) analysis assisted in the interpretation of correlation data, and shows that the new technique provides
measurement accuracy results equivalent to, and sometimes better than, traditional extraction techniques.
Manufacturing implementation of scatterometry and other techniques for 300-mm lithography tool controls
Show abstract
Focus and dose control of lithography tools for leading edge semiconductor manufacturing are critical to obtaining
acceptable process yields and device performance. The need for these controls is increasing due to the apparent limitation of optical water immersion lithography at NA values of approximately 1.35 and the need to use the same equipment for 45nm, 32nm, and 22nm node production. There is a rich history of lithographic controls using various techniques described in the literature. These techniques include (but are not limited to) Phase Grating Focus Monitoring (PGFM) [1], optical CD control using optical overlay metrology equipment (OOCD) [2,3], and, in more recent years, optical scatterometry [4,5]. Some of these techniques, even though they are technically sound, have not been practical to implement in volume manufacturing as controls for various reasons.
This work describes the implementation and performance of two of these techniques (optical scatterometry and OOCD)
in a volume 300mm production facility. Data to be reviewed include:
- General implementation approach.
- Scatterometry dose and focus stability data for 193nm immersion and 248nm dry lithography systems.
- Analysis of the stability of optical scatterometry dose and focus deconvolution coefficients over time for 193nm
immersion and 248nm dry systems.
- Comparison between scatterometry and OOCD techniques for focus monitoring of 248nm dry systems.
The presentation will also describe the practical issues with implementing these techniques as well as describe some possible extensions to enhance the current capabilities being described.
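The dose and focus deconvolution mentioned above can be illustrated under the common assumption that a target's CD shift is linear in dose error and quadratic in defocus. The target response coefficients below are hypothetical, not the facility's calibrated values:

```python
import numpy as np

def deconvolve_dose_focus(cd_shifts, coeffs):
    """Recover (dose error, defocus^2) from CD shifts of two targets.

    Assumes each target's CD shift is a linear combination
        dCD_k = a_k * d_dose + b_k * defocus^2,
    with response coefficients (a_k, b_k) calibrated beforehand from a
    focus-exposure matrix.  All numbers here are illustrative.
    """
    A = np.asarray(coeffs, dtype=float)          # rows: targets
    return np.linalg.solve(A, np.asarray(cd_shifts, dtype=float))

# Hypothetical coefficients: target 1 is dose-dominated, target 2 mixed
# (units: nm per % dose, nm per um^2 of defocus^2).
coeffs = [[2.0, 0.5],
          [1.0, 3.0]]
d_dose, defocus_sq = deconvolve_dose_focus([1.35, 2.6], coeffs)
```

Using two targets with well-separated sensitivities keeps the 2x2 system well conditioned; in practice the coefficients themselves must be monitored for stability over time, which is one of the data items reviewed above.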
SEM II
Methodologies for evaluating CD-matching of CD-SEM
Show abstract
As CD-SEM precision is controlled severely at the sub-nanometer level, we have to evaluate not only the repeatability of tools but also the CD-matching between individual tools. However, it is not easy to measure CD-matching precisely to ±0.1 nm, due to repeatability error, stability change, carryover effects, statistical fluctuation of sampling, etc., which vary among the individual tools. In this work, the uncertainty of the ABBA test is experimentally estimated with a self-ABBA test. The sample carryover trend that dominates the uncertainty of the test can be checked, and mathematical analysis enables precise calculation of the ABBA test.
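The drift-cancelling idea behind an ABBA sequence can be sketched as follows. This is a minimal illustration with simulated CD values, not the paper's full self-ABBA uncertainty analysis:

```python
import numpy as np

def abba_offset(measurements):
    """Estimate the tool-to-tool CD offset from an A-B-B-A sequence.

    Averaging the two A and the two B measurements cancels any drift
    that is linear in time over the sequence (a minimal sketch of the
    ABBA idea; real tests repeat the sequence over many features).
    """
    a1, b1, b2, a2 = measurements
    return (a1 + a2) / 2.0 - (b1 + b2) / 2.0

# Simulated example: true offset 0.1 nm, linear drift 0.05 nm per slot.
true_a, true_b, drift = 50.1, 50.0, 0.05
seq = [true_a + 0 * drift, true_b + 1 * drift,
       true_b + 2 * drift, true_a + 3 * drift]
offset = abba_offset(seq)        # the linear drift cancels exactly
```

A self-ABBA test, in which both halves of the sequence run on the same tool, should return zero offset; any residual then estimates the uncertainty floor of the matching test itself.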
Calibration of a scanning electron microscope in the wide range of magnifications for the microscope operation in the integrated circuit production line
Show abstract
We propose a method of calibration of a scanning electron microscope (SEM) in a wide range of magnifications. We
also describe a method of SEM measurements of linear dimensions of relief elements of micro- and nanostructures
without performing a special calibration of SEM magnification, which can lie in a wide range. The methods are based on
the use of the SEM marker as a measure of length. To do this, the marker has to be certified in special
experiments. We present a method for such certification, based on the use of a test object of trapezoidal profile and large
angles of sidewall inclination. We studied the dependence of the marker characteristics on the SEM working distance
and the nominal marker size declared by the microscope manufacturer. We determined periodicity of performing marker
calibration for the SEM being used. The methods are developed for the calibration of SEMs incorporated into the
integrated circuit production line, whose magnification may vary considerably in the course of operation, depending on
dimensions to be measured.
CD-SEM tool stability and tool-to-tool matching management using image sharpness monitor
Show abstract
As device feature size reduction continues, requirements for Critical Dimension (CD) metrology tools are
becoming stricter. For sub-32 nm node, it is important to establish a CD-SEM tool management system with higher
sensitivity to tool fluctuation and a short turnaround time (TAT). We have developed a new image sharpness
monitoring method, PG monitor. The key feature of this monitoring method is the quantification of tool-induced image
sharpness deterioration. The image sharpness index is calculated by a convolution method of image sharpness
deterioration function caused by SEM optics feature. The sensitivity of this methodology was tested by the alteration of
the beam diameter using astigmatism. PG monitor result can be related to the beam diameter variation that causes CD
variation through image sharpness. PG monitor can detect slight image sharpness changes that cannot be noticed by an
engineer's visual check. Furthermore, PG monitor was applied to tool matching and long-term stability monitoring for
multiple tools. As a result, PG monitor was found to have sufficient sensitivity to CD variation in tool matching and
long-term stability assessment. The investigation showed that PG monitor can detect CD variation equivalent to ~ 0.1
nm. The CD-SEM tool management system using PG monitor is effective for CD metrology in production.
Performance verification of resist loss measurement method using top-view CD-SEM images for hyper-NA lithography
Show abstract
In this study, the principle of the resist loss measurement method proposed in our previous paper [1] was verified. The technique detects resist loss variation using the pattern top roughness (PTR) index determined from scanning electron microscope images. By measuring resist loss with an atomic force microscope, we confirmed that the PTR showed a good correlation with the resist loss and was capable of detecting variations within an accuracy of 20 nm for the evaluated sample. Furthermore, the effect of PTR monitoring on line width control was evaluated by comparing the error in line width control after eliminating undesirable resist loss patterns to that of conventional line width monitoring. The error of line width control was defined as the deviation range of post-etch line widths from post-litho values. Using PTR monitoring, the error in line width control decreased from 10 nm to less than 3 nm, thus confirming
the effectiveness of this method.
Diffraction-Based Overlay
Diffraction-based overlay metrology for double patterning technologies
Show abstract
The extension of optical lithography to 32nm and beyond is made possible by Double Patterning Techniques
(DPT) at critical levels of the process flow. The ease of DPT implementation is hindered by increased significance of
critical dimension uniformity and overlay errors. Diffraction-based overlay (DBO) has been shown to be an effective
metrology solution for accurate determination of the overlay errors associated with double patterning [1, 2] processes. In
this paper we will report its use in litho-freeze-litho-etch (LFLE) and spacer double patterning technology (SDPT),
which are pitch splitting solutions that reduce the significance of overlay errors. Since the control of overlay between
various mask/level combinations is critical for fabrication, precise and accurate assessment of errors by advanced
metrology techniques such as spectroscopic diffraction based overlay (DBO) and traditional image-based overlay (IBO)
using advanced target designs will be reported. A comparison between DBO, IBO and CD-SEM measurements will be
reported. A discussion of TMU requirements for 32nm technology and TMU performance data of LFLE and SDPT
targets by different overlay approaches will be presented.
Through-focus scanning and scatterfield optical methods for advanced overlay target analysis
Show abstract
In this paper we present overlay measurement techniques that use small overlay targets for advanced semiconductor
applications. We employ two different optical methods to measure overlay using modified conventional optical
microscope platforms. They are scatterfield and through-focus scanning optical microscope (TSOM) imaging methods.
In the TSOM method a target is scanned through the focus of an optical microscope, simultaneously acquiring optical
images at different focal positions. The TSOM images are constructed using the through-focus optical images. Overlay
analysis is then performed using the TSOM images. In the scatterfield method, a small aperture is scanned at the
conjugate back focal plane of an optical microscope. This enables angle-resolved scatterometry on a high-magnification
optical platform. We also present evaluation of optical constants using the scatterfield method.
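The TSOM construction described above, taking a cross-section from each through-focus image and stacking them into a position-versus-focus map, can be sketched as follows. The normalization to relative contrast and the synthetic image stack are illustrative assumptions:

```python
import numpy as np

def tsom_image(stack, row):
    """Build a TSOM image from a through-focus stack of optical images.

    `stack` has shape (n_focus, ny, nx).  A single cross-section (one
    image row through the target) is taken at each focal position and
    stacked so that one axis is lateral position and the other is
    focus.  Intensities are expressed as relative contrast about the
    mean (a simplified sketch of TSOM normalization).
    """
    profiles = stack[:, row, :]            # shape (n_focus, nx)
    mean = profiles.mean()
    return (profiles - mean) / mean        # relative contrast map

# Synthetic 5-focus stack of 8x8 "images" with a bright center column
# whose intensity varies with focus, standing in for a real target.
stack = np.ones((5, 8, 8))
stack[:, :, 4] += np.linspace(0.2, 0.6, 5)[:, None]
img = tsom_image(stack, row=4)             # shape (5, 8)
```

The resulting two-dimensional pattern is what carries the overlay signal: small target asymmetries that are hard to see in any single focal image show up as characteristic asymmetries of the TSOM map.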
Mask Metrology
Cr migration on 193nm binary photomasks
Show abstract
A new type of chrome-on-glass (COG) photomask defect was observed in 2006. Absorber material migrated into vias on
dark field masks, partially obscuring the incident 193nm light and thereby causing the imaged photoresist to be
underexposed. Through detailed characterization of new and defective photomasks and their histories it was determined
that the migration is not caused by any unusual line events or faulty mask handling procedures. Rather, it is an inevitable
result of mask use under specific conditions. Four essential elements have been identified: the presence of Cr, 193nm
light exposure, charge, and water vapor; their roles have been elucidated through modeling studies and the existing literature. We
have reproduced Cr migration in the laboratory, demonstrating that these four elements are necessary and sufficient for
this type of defect to occur. The only way to avoid Cr migration is to avoid reactions with water vapor.
Compute resource management and TAT control in mask data prep
Show abstract
With each new process technology node, chip designs increase in complexity and size, and mask data prep flows require
more compute resources to maintain the desired turnaround time (TAT). In addition to maintaining TAT, mask data
prep centers are trying to lower costs. Securing highly scalable processing for each element of the flow - geometry
processing, resolution enhancement and optical process correction, verification, and fracture - has been the focal point
so far toward the goal of lowering TAT. Processing utilization for different flow elements depends on the
operation, the data hierarchy, and the device type. In this paper we introduce a dynamic, utilization-driven
compute resource control system applied to a large-scale parallel computation environment. The paper explains the
performance challenges in optimizing a mask data prep flow for TAT and cost while designing a compute resource
system and its framework. In addition, the paper analyzes the performance metrics TAT and throughput of a production
system and discusses trade-offs of different parallelization approaches to data processing in interaction with dynamic
resource control. The study focuses on the 65nm and 45nm process nodes.
Investigation of phase distribution using Phame in-die phase measurements
Show abstract
As lithography mask processes move toward 45nm and 32nm node, mask complexity increases steadily, mask specifications tighten and process control becomes extremely important. Driven by this fact the requirements for metrology tools increase as well. Efforts in metrology have been focused on accurately measuring CD linearity and uniformity across the mask, and accurately measuring phase variation on Alternating/Attenuated PSM and transmission for Attenuated PSM.
CD control on photomasks is usually achieved through the following processes: exposure dose/focus change, resist develop, and dry etch. The key requirement is to maintain correct CD linearity and uniformity across the mask. For PSM specifically, CD uniformity for both Alternating and Attenuated PSM, and etch depth for Alternating PSM, also become important. So far, phase measurement has been limited to either measuring large-feature phase using interferometer-based metrology tools or measuring etch depth using AFM and converting etch depth into phase, under the assumption that the trench profile and the optical properties of the layers remain constant. However, recent investigations show that the trench profile and the optical properties of the layers impact the phase, and this effect grows for smaller CDs. The currently used phase measurement methods run into limitations because they are not able to capture 3D mask effects, diffraction limitations, or polarization effects. The new phase metrology system Phame®, developed by Carl Zeiss SMS, overcomes those limitations and enables laterally resolved phase measurement in any kind of production feature on the mask. The resolution of the system goes down to 120nm half pitch at mask level.
We will report on tool performance data with respect to static and dynamic phase repeatability focusing on Alternating PSM. Furthermore the phase metrology system was used to investigate mask process signatures on Alternating PSM in order to further improve the overall PSM process performance. Especially global loading effects caused by the pattern density and micro loading effects caused by the feature size itself have been evaluated using the capability of measuring phase in the small production features. The results of this study will be reported in this paper.
Image library approach to evaluating parametric uncertainty in metrology of isolated feature width
Show abstract
When measuring the width of an isolated line or space on a wafer or photomask, only the feature's
image is measured, not the object itself. Often the largest contributors to measurement uncertainty are
the uncertainties in the parameters which affect the image. Measurement repeatability is often smaller
than the combined parametric uncertainty.
An isolated feature's edges are far enough away from nearest edges of other features that its image
does not change if this distance is increased (about 10 wavelengths in an optical microscope or
exposure tool, or several effective-beam-widths in an SEM). When the leading and trailing edges of the
same feature are not isolated from each other the metrology process becomes nonlinear. Isolated
features may not be amenable to measurement by grating methods (e.g., scatterometry), and there is no
hard lower limit to how small an isolated feature can be measured. There are several ways to infer the
size of an isolated feature from its image in a microscope (SEM, AFM, optical,...), and they all require
image modeling.
Image modeling accounts for the influence of all of the parameters which can affect the image, and
relates the apparent linewidth (in the image) to the true linewidth (on the object). The values of these
parameters, however, have uncertainties and these uncertainties propagate through the model and lead
to parametric uncertainty in the linewidth measurement, along with the scale factor uncertainty and the
measurement repeatability. The combined measurement uncertainty is required in order to decide if
the result is adequate for its intended purpose and to ascertain if it is consistent with other similar
results.
The parametric uncertainty for optical photomask measurements derived using an edge threshold
approach has been described previously [1]; this paper describes an image library approach to this
issue and shows results for optical photomask metrology over a linewidth and spacewidth range of 10
nm to 4 μm. The principles will be described; the 1-dimensional image library used and the method of
comparing images, along with a simple interpolation method, will be explained; and results will be presented.
This method is easily extended to any kind of imaging microscope and to p dimensions, where p is the
number of imaging parameters used. It is more general than the edge threshold method and leads to
markedly different results for features smaller than a wavelength.
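A minimal sketch of the image library approach is shown below: a measured 1-D profile is matched to a simulated library by least squares, and the best-fit width is refined with a parabolic interpolation of the mismatch around the minimum. The Gaussian "images" stand in for real microscope image simulations, and the width grid is illustrative:

```python
import numpy as np

def library_match(measured, library, widths):
    """Match a measured 1-D image profile against a simulated library.

    The best-fit linewidth minimizes the sum of squared differences
    over the library; a 3-point parabolic interpolation of the cost
    around the minimum refines the estimate between library entries
    (a simplified sketch of library-based width extraction).
    """
    cost = np.sum((library - measured) ** 2, axis=1)
    i = int(np.argmin(cost))
    if 0 < i < len(cost) - 1:
        c0, c1, c2 = cost[i - 1], cost[i], cost[i + 1]
        shift = 0.5 * (c0 - c2) / (c0 - 2 * c1 + c2)
        return float(widths[i] + shift * (widths[i + 1] - widths[i]))
    return float(widths[i])

# Synthetic library: Gaussian "images" of lines of width 40..60 nm.
x = np.linspace(-100, 100, 201)
widths = np.arange(40.0, 61.0, 2.0)
library = np.array([np.exp(-x**2 / (2 * (w / 2) ** 2)) for w in widths])
measured = np.exp(-x**2 / (2 * (47.0 / 2) ** 2))   # true width 47 nm
w_est = library_match(measured, library, widths)
```

Extending this to p imaging parameters, as the abstract describes, replaces the 1-D cost curve with a p-dimensional cost surface interpolated the same way.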
Inspection
New inspection technology for hole pattern by Fourier space on hp 4x-nm generation
Show abstract
We attempted to detect CD variation of 4x-generation hole patterns using diffracted light in Fourier space with polarized light and modified illumination.
A new technique, the DD (Dual Diffraction) method, has been developed based on optical simulation and experimental approaches. We present a case of detecting diameter variation on a multi-layered hole pattern with the new method.
Development of optical simulation tool for defect inspection
Show abstract
Much effort has been devoted to detecting defects of interest (DOI) with optical inspection systems, because DOI size shrinks with the design rule of semiconductor devices. Performance of an inspection system depends on complicated optical conditions in the illumination and collection systems, including wavelength and polarization filtering. The magnitude of the defect signal for a given optical condition was estimated using a simulation tool, in order to find suitable optical conditions and the technologies required in the future. This tool, consisting of a near-field calculation using Finite Difference
Time Domain (FDTD) methods and an image formation calculation based on Fourier optics, is applicable not only to
Köhler illumination system but also to confocal system and dark field system. We investigated defect inspection methods
for the 45 nm and the next technology nodes. For inspection of various defects, the system using several wavelengths is
suitable. For inspection of a specific defect, the system with polarization control is suitable. Our calculation suggests that
the defect detection sensitivity for the 1X nm technology node should be increased by more than 10 times compared to
the 45 nm technology node.
Phenomenology of electron-beam-induced photoresist shrinkage trends
Show abstract
For many years, lithographic resolution has been the main obstacle in keeping the pace of transistor densification to meet
Moore's Law. For the 45 nm node and beyond, new lithography techniques are being considered, including immersion
ArF (iArF) lithography and extreme ultraviolet lithography (EUVL). As in the past, these techniques will use new types
of photoresists with the capability to print 45 nm node (and beyond) feature widths and pitches.
In a previous paper [1], we focused on ArF and iArF photoresist shrinkage. We evaluated the magnitude of shrinkage for
both R&D and mature resists as a function of chemical formulation, lithographic sensitivity, scanning electron
microscope (SEM) beam condition, and feature size. Shrinkage results were determined by the well-accepted
methodology described in ISMI's CD-SEM Unified Specification [2].
A model for resist shrinkage, derived elsewhere [3], was presented that can be used to curve-fit the shrinkage
data resulting from multiple repeated measurements of resist features. Parameters in the curve-fit allow for metrics
quantifying total shrinkage, shrinkage rate, and initial critical dimension (CD) from before e-beam exposure. The ability
to know this original CD is the most desirable result; in this work, the ability to use extrapolation to solve for a given
original CD value was also experimentally validated by CD-atomic force microscope (AFM) reference metrology.
Historically, many different conflicting shrinkage results have been obtained among the many works generated through
the litho-metrology community. This work, backed up by an exhaustive dataset, will present an explanation that makes
sense of these apparent discrepancies. Past models for resist shrinkage inherently assumed that the photoresist line is
wider than the region of the photoresist to be shrunk [3], or, in other words, the e-beam never penetrates enough to reach
all material in the interior of a feature; consequently, not all photoresist is affected by the shrinkage process. In actuality,
there are two shrinkage regimes, which are dependent on resist feature CD or thickness. Past shrinkage models are true
for larger features. However, our results show that when linewidth becomes less than the eventual penetration depth of
the e-beam after full shrinkage, the apparent shrinkage magnitude decreases while shrinkage speed accelerates. Thus, for
small features, most shrinkage occurs within the first measurement. This is crucial when considering the small features
to be fabricated by immersion lithography.
In this work, the results from the previous paper [1] will be shown to be consistent with numerically simulated results,
thus lending credibility to the postulations in [1].
With these findings, we can conclude with observations about the readiness of SEM metrology for the challenges of both
dry and immersion ArF lithographies as well as estimate the errors involved in calculating the original CD from the
shrinkage trend.
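The extrapolation back to the original CD from a shrinkage trend can be sketched with a first-order decay model of the kind referenced above. The functional form, the grid-search fitting, and the numbers below are illustrative assumptions, not the paper's exact model:

```python
import numpy as np

def fit_shrinkage(n, cd, k_grid=np.linspace(0.01, 2.0, 200)):
    """Fit CD(n) = CD_inf + (CD_0 - CD_inf) * exp(-k * n) to repeated
    CD-SEM measurements and extrapolate back to the unexposed CD_0.

    A hypothetical first-order shrinkage form is assumed; the rate k
    is found by grid search and the two linear amplitudes by least
    squares at each candidate k.
    """
    best = None
    for k in k_grid:
        basis = np.column_stack([np.ones_like(n, dtype=float),
                                 np.exp(-k * n)])
        coef, *_ = np.linalg.lstsq(basis, cd, rcond=None)
        sse = np.sum((basis @ coef - cd) ** 2)
        if best is None or sse < best[0]:
            best = (sse, k, coef)
    _, k, (cd_inf, amp) = best
    return cd_inf + amp, k          # extrapolated CD_0 and rate

# Synthetic shrinkage trend: CD_0 = 50 nm decaying toward 44 nm.
n = np.arange(1, 9)                 # measurement number
cd = 44.0 + 6.0 * np.exp(-0.35 * n)
cd0, k = fit_shrinkage(n, cd)
```

For narrow features in the second shrinkage regime described above, the fitted rate k is larger and the first measurement already sits deep into the decay, which is why extrapolation rather than direct measurement is needed to recover the original CD.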
Systematic defect filtering and data analysis methodology for design based metrology
Show abstract
Recently, several Design Based Metrologies (DBMs) have been introduced and are in use for wafer verification. The
major applications of DBM are OPC accuracy improvement, DFM feedback through Process Window
Qualification (PWQ), and advanced process control. In general, however, the amount of output data from DBM is
so large that it is very hard to handle for valuable feedback. In the case of PWQ, thousands
of hot spots are detected on a single chip at the edge of the process window, so it takes much time and labor to review
and analyze all the hot spots detected at PWQ. Design-related systematic defects, however, appear repeatedly,
and if they can be classified into groups, a lot of analysis time can be saved.
We have demonstrated an EDA tool which can handle the large amount of output data from DBM by reducing
pattern defects to groups. It can classify millions of patterns into fewer than a few thousand pattern groups. It has been
evaluated on the analysis of PWQ of a metal layer in a NAND Flash memory device and of random contact hole patterns
in a DRAM device.
The results show that this EDA tool can handle the CD measurement data easily and saves a lot of time and
labor in the analysis. The procedures of systematic defect filtering and data handling using the EDA tool are
presented in detail.
Quantitative measurement of voltage contrast in SEM images for in-line resistance inspection of wafers manufactured for SRAM
Show abstract
An in-line inspection method for partial-electrical measurement of defect resistance, which is
quantitatively estimated from the voltage contrast formed in an SEM image of an incomplete-contact
defect, was developed. This inspection method was applied to wafers manufactured for an SRAM
device. That is, the gray scales of the defect images captured on an SRAM plug pattern were
quantitatively analyzed. Accordingly, the gray scales of defective plugs formed for shared contact
patterns were classified into two levels. The higher contrasts, which were calculated from the grayscales
of the darker defects, were about 100%; the lower contrasts, which were calculated from the grayscales of
the other defects, were from 38% to 60%. The resistances of these defects were estimated from a
calibration curve obtained from the grayscales of the SEM images and the resistances of deliberately
formed failures on standard wafers for voltage-contrast estimation. The estimated resistances of the
lower-contrast defects (with an accuracy of about an order of magnitude) agree well with the resistances
measured by a nano-prober. It is concluded that this in-line inspection method for partial-electrical
measurement is a useful technique for defect classification based on defect resistance and defect mode.
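The resistance estimation from voltage contrast via a calibration curve can be sketched as interpolation in log-resistance, which matches the reported order-of-magnitude accuracy. The calibration points below are hypothetical, chosen only to span the contrast range reported above:

```python
import numpy as np

def resistance_from_contrast(contrast, cal_contrast, cal_resistance):
    """Estimate defect resistance from SEM voltage contrast.

    Interpolates on a calibration curve built from deliberately formed
    failures of known resistance.  Interpolation is done in
    log10(resistance) because the contrast spans many decades of
    resistance (an illustrative sketch, not the paper's exact curve).
    """
    log_r = np.interp(contrast, cal_contrast, np.log10(cal_resistance))
    return 10.0 ** log_r

# Hypothetical calibration: contrast (%) vs. resistance (ohm) from
# deliberately formed failures on a standard wafer.
cal_c = np.array([38.0, 60.0, 100.0])
cal_r = np.array([1e9, 1e7, 1e4])
r = resistance_from_contrast(49.0, cal_c, cal_r)
```

Working in log-resistance reflects the physics of the contrast mechanism: equal steps in grayscale correspond roughly to equal factors, not equal increments, of defect resistance.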
Study of devices leakage of 45nm node with different SRAM layouts using an advanced e-beam inspection systems
Show abstract
In this study, a nickel silicide (NiSi) wafer and a WCMP wafer were used. We captured bright voltage contrast (BVC) defects at N+/P-well on the NiSi wafer, and we also captured N+/P-well leak/short defects on the WCMP wafer as BVC defects in positive-mode inspection and as dark voltage contrast (DVC) defects in Negative Mode™ inspection. The N+/P-well leakage signatures of the two inspection modes on WCMP strongly correlate with each other, indicating that they are the same defects. The N+/P-well leakage signature on the WCMP wafer also correlates with that on the NiSi wafer. With negative-mode inspection, we captured P+/N-well leakage on the WCMP wafer at two different static random access memory (SRAM) arrays (SRAM1 and SRAM3) as DVC defects. The P+/N-well leakage signature is very different from the N+/P-well leakage signature in SRAM3, and the P+/N-well leakage signature of SRAM1 is also very different from that of SRAM3. This study confirmed our prediction that different SRAM layouts cause different P+/N-well leakage, especially in the case of over-etching of the shared contact hole.
Process Control
Hotspot monitoring system with contour-based metrology
Show abstract
As design rules shrink, hotspot management is becoming increasingly important. In this paper, we propose an automatic
hotspot monitoring system that serves as the final step in the hotspot management flow. The key technology behind
automatic hotspot monitoring is contour-based metrology. It is an effective method for evaluating complex patterns, such
as hotspots, and its efficiency has been proved in the field of optical proximity correction (OPC) calibration.
In our system, contour-based metrology is used as a process control tool on mass-production lines.
The pattern evaluation methodology has been developed to achieve high sensitivity. Lithography simulation
selects the hotspots to be monitored and, furthermore, indicates the most sensitive points within the field of view (FOV) of a
hotspot image. Quantification of these most sensitive points is consistent with an engineer's visual check of a hotspot's
shape. Its validity has been demonstrated in process window determination. This system has the potential to
substantially shorten the turnaround time (TAT) for hotspot monitoring.
Outliers detection by fuzzy classification method for model building
Show abstract
Optical Proximity Correction (OPC) is used in lithography to increase the achievable resolution and pattern transfer
fidelity for IC manufacturing. Nowadays, immersion lithography scanners are reaching the limits of optical resolution,
placing more and more constraints on OPC models in terms of simulation reliability. The detection of outliers in
SEM measurements is key in OPC [1]. Indeed, model reliability depends in large part on the accuracy and
reliability of those measurements, since they belong to the data set used to calibrate the model. Many approaches have been developed
for outlier detection by studying the data and their residual errors, using linear or nonlinear regression with standard
deviation as a metric [8].
In this paper, we present a statistical approach for the detection of outlier measurements. The approach scans
Critical Dimension (CD) measurements by process condition using a statistical method based on fuzzy C-means
clustering and uses a covariance-based distance to check for aberrant values cluster by cluster. We propose
the Mahalanobis distance [2] in order to improve the discrimination of outliers when quantifying the similarity within
each cluster of the data set.
This fuzzy classification method was applied to the SEM CD data collected for the Active layer of a 65 nm half-pitch
technology. The measurements were acquired across a process window of 25 (dose, defocus) conditions. We were able
to automatically detect 15 potential outliers in a data distribution as large as 1500 CD measurements. We
discuss these results as well as the advantages and drawbacks of this technique for automatic outlier detection when
cleaning large data distributions.
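The clustering-plus-distance screen described above can be sketched in a few lines of NumPy. This is an illustrative reimplementation, not the authors' code: the cluster count, the fuzzifier m=2, and the 3-sigma Mahalanobis threshold are assumptions.

```python
import numpy as np

def fuzzy_cmeans(X, n_clusters, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy C-means: returns cluster centers and membership matrix U."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), n_clusters))
    U /= U.sum(axis=1, keepdims=True)            # each row sums to 1
    for _ in range(n_iter):
        w = U ** m
        centers = (w.T @ X) / w.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))            # standard FCM membership update
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U

def mahalanobis_outliers(X, centers, U, thresh=3.0):
    """Flag points whose Mahalanobis distance to their own cluster exceeds thresh."""
    labels = U.argmax(axis=1)
    flagged = np.zeros(len(X), dtype=bool)
    for k in range(len(centers)):
        idx = np.where(labels == k)[0]
        if len(idx) <= X.shape[1] + 1:           # too few points for a covariance
            continue
        diff = X[idx] - centers[k]
        inv_cov = np.linalg.inv(np.cov(X[idx].T))
        d2 = np.einsum('ij,jk,ik->i', diff, inv_cov, diff)
        flagged[idx[np.sqrt(d2) > thresh]] = True
    return flagged
```

In a workflow like the one in the abstract, rows of X would be CD measurements grouped by (dose, defocus) condition, and flagged rows would be reviewed before OPC model calibration.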
Monitoring measurement tools: new methods for driving continuous improvements in fleet measurement uncertainty
Show abstract
Ever shrinking measurement uncertainty requirements are difficult to achieve for a typical metrology
toolset, especially over the entire expected life of the fleet. Many times, acceptable performance can be
demonstrated during brief evaluation periods on a tool or two in the fleet. Over time and across the rest of
the fleet, the most demanding processes often have measurement uncertainty concerns that prevent optimal
process control, thereby limiting premium part yield, especially on the most aggressive technology nodes.
Current metrology statistical process control (SPC) monitoring techniques focus on maintaining the
performance of the fleet where toolset control chart limits are derived from a stable time period. These
tools are prevented from measuring product when a statistical deviation is detected. Lastly, these charts
are primarily concerned with daily fluctuations and do not consider the overall measurement uncertainty. It
is possible that the control charts implemented for a given toolset suggest a healthy fleet while many of
these demanding processes continue to suffer measurement uncertainty issues. This is especially true when
extendibility is expected in a given generation of toolset. With this in mind, there is a need to continually
improve the measurement uncertainty of the fleet until it can no longer meet the requirements, at
which point new technology must be considered. This paper explores new methods of analyzing
existing SPC monitor data to assess the measurement performance of the fleet and look for opportunities to
drive improvements. Long term monitor data from a fleet of overlay and scatterometry tools will be
analyzed. The paper also discusses using other methods besides SPC monitors to ensure the fleet stays
matched; a set of SPC monitors provides a good baseline of fleet stability but it cannot represent all
measurement scenarios happening in product recipes. The analyses presented deal with measurement
uncertainty on non-measurement altering metrology toolsets such as scatterometry, overlay, atomic force
microscopy (AFM) or thin film tools. The challenges associated with monitoring toolsets that damage the
sample such as the CD-SEMs will also be discussed. This paper also explores improving the monitoring
strategy through better sampling and monitor selection. The industry also needs to converge regarding the metrics used to describe the matching component of measurement uncertainty so that a unified approach is
reached on how best to drive the much-needed improvements. In conclusion, there will be a
discussion on automating these new methods [3,4] so they can complement the existing methods and provide a
better system for controlling and driving matching improvements across the fleet.
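As a toy illustration of the SPC baseline idea discussed above (not the paper's actual monitor implementation), control limits can be derived from a stable baseline period and used to flag excursions; the 3-sigma width is an assumption.

```python
import numpy as np

def control_limits(baseline, k=3.0):
    """Derive lower/upper control limits from a stable baseline period."""
    mu = baseline.mean()
    sd = baseline.std(ddof=1)          # sample standard deviation
    return mu - k * sd, mu + k * sd

def flag_excursions(series, lcl, ucl):
    """Indices and values of monitor readings outside the control limits."""
    return [(i, x) for i, x in enumerate(series) if not lcl <= x <= ucl]
```

Limits derived this way track day-to-day stability only; as the abstract argues, they say nothing about the fleet's total measurement uncertainty, which is why the authors look beyond SPC monitors.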
Two-dimensional dose and focus-error measurement technology for exposure tool management in half-pitch 3x generation
Show abstract
As semiconductor device design rules shrink, pattern profile management is becoming more critical, so high
accuracy and high measurement frequency are required for CD (Critical Dimension) and LER (Line Edge Roughness) measurements.
We have already presented a technology to inspect pattern profile variations across an entire wafer with high throughput [1] [2].
Using that technology, we can inspect CD and LER variations over the entire wafer quickly, but we could not separate the
signal into CD and LER components. In this work, we measured the Stokes parameters, i.e., the polarization state, of the
light reflected from defective patterns. As a result, we characterized how the polarization state changes with
dose and focus defects, and we found a way to separate the signal into CD and LER variations, i.e., dose errors and focus
errors, from the S2 and S3 Stokes parameters. We verified experimentally that we were able to calculate the values of CD and LER variations
from S2 and S3. Furthermore, to address the issue that many images are needed to calculate the S2
and S3 values, we developed a new method to obtain CD and LER variations accurately in a short time.
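The separation step can be illustrated schematically. S2 and S3 are standard Stokes components obtainable from four analyzer intensities; the linear mapping from (S2, S3) to (dose error, focus error) below is purely hypothetical, since the paper derives the actual relation experimentally.

```python
import numpy as np

def stokes_s2_s3(i_45, i_135, i_rcp, i_lcp):
    """S2 from +/-45 deg linear analyzers, S3 from right/left circular analyzers."""
    return i_45 - i_135, i_rcp - i_lcp

# Hypothetical sensitivity matrix mapping (dose_err, focus_err) -> (S2, S3);
# in practice it would be calibrated from wafers with programmed dose/focus defects.
A = np.array([[0.8, 0.1],
              [0.2, 0.9]])

def dose_focus_errors(s2, s3):
    """Invert the (assumed linear) response to recover dose and focus errors."""
    return np.linalg.solve(A, np.array([s2, s3]))
```

The inversion is well posed only when the two Stokes responses are not degenerate, i.e., when A is well conditioned.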
Increased uniformity control in a 45nm polysilicon gate etch process
Show abstract
As die feature sizes continue to decrease, advanced process control has become essential for controlling profile and CD
uniformity across the wafer. Gate CD variation must be suppressed by process optimization of lithography, photoresist
trim, and gate etch in order to achieve the demanding CD control tolerances. Currently, APC is used in the lithography
and etch processes for within wafer (WiW) and wafer-to-wafer (W2W) CD control. APC can make improvements in
process results, but there is still variation that needs to be further reduced. Analysis of the current lithography edge CD
showed that the variation trend transferred to the post-etch edge CD measurement. Additionally, the etch process created
variation in the edge CD independently of the lithography process. It can be challenging to compensate for the variations
in the etch process, and such compensations degrade through-pitch OPC. Multivariable control of the etch process can
reduce the need for compensations and, consequently, through-pitch variation. A DOE was designed and run using the
production etch process as a center reference to create a WiW etch control model. This control model was then
tested with a MATLAB-based simulation program that simulates the etch production process sequence and its ability to
target the edge CD. This demonstration shows that, through rigorous methodology, a multivariate model can be created
for both center CD (W2W) and edge CD (WiW) control, providing an opportunity at etch to reduce
compensation for etch variations at litho, and the capability at etch to compensate for both litho and etch
uniformity changes on a per-wafer basis.
The measurement uncertainty challenge for the future technological nodes production and development
Show abstract
With the continuous shrinkage of dimensions in the semiconductor industry, measurement uncertainty is becoming
one of the major components that must be controlled in order to guarantee sufficient production yield for the next
technological nodes. Thus, CD-SEM and scatterometry techniques have to face new challenges in terms of
accuracy, and consequently new challenges in measurement uncertainty that were not really taken into account when
they were first introduced into production.
In this paper, we present and discuss results on the accuracy requirements of key applications for
advanced technological node production. We present results on OPC model precision improvement
using a suitable reference metrology model based on the 3D-AFM technique. An interesting study of 193nm resist
shrinkage during CD-SEM measurement is also presented, and its impact on measurement uncertainty
is discussed. Finally, we conclude by showing the potential industrial benefits of using a simple but
relevant 3D-AFM reference metrology model in the semiconductor production environment.
Scatterometry II
Angle-resolved scatterfield microscope for linewidth measurement
Show abstract
The angle-resolved scatterfield microscope (ARSM) has been under development for several years. It combines an optical microscope
and an angle-resolved scatterometer using a relay lens and an aperture. In our research, a spatial light modulator (SLM)
is used instead of the relay lens and the aperture. In the SLM, phase modulation is used to emulate a Fresnel
lens, so an incident plane wave is modulated and focused onto the back focal plane of the objective lens. A plane
wave whose angle corresponds to the position of the focused spot on the back focal plane is emitted from the
entrance pupil of the objective lens. By modulating the SLM, the angle of the plane wave from the objective lens can be
changed. In our system, an objective lens with an NA of 0.95 and a magnification of 50 is used for wide-angle scanning.
A bare silicon wafer and a grating with a pitch of 417nm were measured with a full-angle scan. The advantage of using the SLM
is fully optical modulation, that is, no mechanical motion is needed in the ARSM. Thus, the system
has higher throughput and better stability.
Optical CD metrology model evaluation and refining for manufacturing
Show abstract
Optical critical dimension (OCD) metrology has been well accepted as a standard inline metrology technique in
semiconductor manufacturing since the 65nm technology node for its non-destructive and versatile advantages. Many
geometry parameters can be obtained in a single measurement with good accuracy if the model is well established and
calibrated by transmission electron microscopy (TEM). However, from the viewpoint of manufacturing, there is no
effective index of model quality and, based on that, of model refinement. Moreover, as device structures become more
complicated, as in strained silicon technology, more parameters must be determined in subsequent
measurements. The model therefore requires more attention to ensure inline metrology reliability. GOF
(goodness-of-fit), one model index given by a commercial OCD metrology tool, for example, is not sensitive enough,
while correlation and sensitivity coefficients, the other two indexes, are evaluated under metrology tool noise only and are not
directly related to inline production measurement uncertainty. In this article, we propose a sensitivity matrix for
measurement uncertainty estimation, in which each entry is defined as the correlation coefficient between the
corresponding two floating parameters and is obtained by linearization. The uncertainty is estimated in
combination with production line variation and is found, for the first time, to be much larger than that from metrology tool noise
alone, which indicates that model quality control is critical for nanometer device production control. The uncertainty, in
comparison with production requirements, also serves as an index for model refinement, either by grid size rescaling or by structure
model modification. This method is verified by TEM measurement and, finally, a flow chart for model refinement is
proposed.
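The linearization idea can be sketched as ordinary least-squares uncertainty propagation: if J is the Jacobian of the simulated spectrum with respect to the floating parameters, the parameter covariance under white noise sigma is sigma^2 (J^T J)^-1, and its normalized entries give the parameter correlations. This is a generic sketch of the linearization theorem the authors invoke, not their specific sensitivity matrix.

```python
import numpy as np

def parameter_uncertainty(J, sigma):
    """Linearized parameter covariance for a least-squares spectral fit.

    J     : (n_wavelengths, n_params) Jacobian of the spectrum w.r.t. parameters
    sigma : noise level (assumed white) on each spectral point
    """
    cov = sigma**2 * np.linalg.inv(J.T @ J)
    sig_p = np.sqrt(np.diag(cov))           # 1-sigma parameter uncertainties
    corr = cov / np.outer(sig_p, sig_p)     # parameter correlation matrix
    return sig_p, corr
```

One way to read the abstract's finding: replacing the tool-noise sigma here with the full production-line variation inflates the estimated uncertainties relative to the tool-noise-only figure.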
Uncertainty and sensitivity analysis and its applications in OCD measurements
Show abstract
This article describes an Uncertainty & Sensitivity Analysis package, a mathematical tool that can be an
effective time-shortcut for optimizing OCD models. By including real system noise in the model, an accurate method
for predicting measurement uncertainties is shown. The early-stage assessment of the uncertainties, sensitivities
and correlations of the parameters to be measured drives the user in the optimization of the OCD measurement strategy.
Real examples are discussed revealing common pitfalls like hidden correlations and simulation results are compared
with real measurements. Special emphasis is given to 2 different cases: 1) the optimization of the data set of multi-head
metrology tools (NI-OCD, SE-OCD), 2) the optimization of the azimuth measurement angle in SE-OCD. With the
uncertainty and sensitivity analysis result, the right data set and measurement mode (NI-OCD, SE-OCD or NI+SE OCD)
can be easily selected to achieve the best OCD model performance.
Reference Metrology
AFM method for sidewall measurement through CNT probe deformation correction and its accuracy evaluation
Show abstract
To use atomic force microscope (AFM) to measure dense patterns of 32-nm node structures, there is a difficulty in
providing flared probes that go into narrow vertical features. Using carbon nanotube (CNT) probes is a possible
alternative. However, even with its extremely high stiffness, van der Waals attractive force from steep sidewalls bends
CNT probes. This probe deflection effect causes deformation (or "swelling") of the measured profile. When measuring
100-nm-high vertical sidewalls with a 24-nm-diameter and 220-nm-long CNT probe, the probe deflection can cause a
bottom CD bias of 13.5 nm. This phenomenon is inevitable when using long, thin probes whichever scanning method is
used. We proposed a method to deconvolve this probe deflection effect. By detecting torsional motion of the base
cantilever for the CNT probe, it is possible to estimate the amount of CNT probe deflection. Using this information, we
have developed a technique for deconvolving the probe deformation effect from measured profiles. This technique, in
combination with deconvolution of the probe shape effect, enables vertical sidewall profile measurement.
We have quantitatively evaluated the performance of the proposed method using an improved version of a "tip
characterizer" developed at the National Institute of Advanced Industrial Science and Technology (AIST), which has a
well-defined high-aspect-ratio line and space structure with a variety of widths ranging from 10 to 60 nm. The critical
dimension (CD) values of the line features measured with the proposed AFM method showed good matches to TEM-calibrated
CD values. The biases were within a range of ±1.7 nm for combinations of three different probes, five
different patterns, and two different threshold heights, which is a remarkable improvement from the bias range of ±4.7
different patterns, and two different threshold heights, which is a remarkable improvement from the bias range of ±4.7
nm with the conventional probe tip shape deconvolution method. The static repeatability was 0.54 nm (3σ), compared to
1.1 nm with the conventional method. Using a 330-nm-deep tip characterizer, we also proved that a 36-nm-narrow
groove could be clearly imaged.
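The probe-shape deconvolution step mentioned above is conventionally done with grayscale morphology (as in Villarrubia-style tip reconstruction): an AFM trace is approximately the dilation of the surface by the tip shape, and eroding the trace with the tip gives an upper-bound estimate of the surface. The 1-D sketch below shows only that generic idea; the paper's torsion-based CNT deflection correction is a separate step not modeled here, and the tip profile is made up.

```python
import numpy as np

def dilate(f, tip):
    """Image formation: the AFM trace is (approximately) the dilation of
    the surface profile f by the tip height profile."""
    h = len(tip) // 2
    pad = np.pad(f, h, mode='edge')
    return np.array([max(pad[i + k] + tip[k] for k in range(len(tip)))
                     for i in range(len(f))])

def erode(f, tip):
    """Tip-shape deconvolution: erosion of the trace by the tip yields an
    upper-bound reconstruction of the true surface."""
    h = len(tip) // 2
    pad = np.pad(f, h, mode='edge')
    return np.array([min(pad[i + k] - tip[k] for k in range(len(tip)))
                     for i in range(len(f))])
```

The reconstruction is exact on flat regions and on plateau tops but remains an upper bound at steep sidewalls, which is exactly where the deflection correction described in the abstract is needed.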
Poster Session
Alignment method of self-aligned double patterning process
Show abstract
Double patterning technology (DPT) is the best alternative for achieving the 3x nm NAND flash node with 193nm immersion
lithography before entering the EUV regime. The self-aligned double patterning (SADP) process is one of several DPT
approaches and is the most likely to be introduced into NAND flash manufacturing. The typical single-exposure process at the
40nm flash node becomes a multiple-exposure job at the 32nm node with DPT or SADP, and overlay control
among these multiple exposures is much more tightly restricted than in a single-exposure process. Meeting the tight overlay specification
relies mainly on the scanner's alignment system, but well-designed alignment marks with high-contrast
signals and outstanding sustainability are essential factors as well. Typically, the feature size patterned in
SADP is around 3x nm, which is too narrow to form signals strong enough to serve as alignment marks or
overlay marks. In this paper, we discuss 1. the performance of the alignment algorithm for direct alignment and
indirect alignment, 2. different alignment mark designs, and 3. film scheme dependence (layer dependence).
Experimental results show that the new mark design provides sufficient contrast and signal for aligning subsequent
layers.
Application results of lot-to-lot high-order overlay correction for sub-60-nm memory device fabrication
Show abstract
According to the international technology roadmap for semiconductors 2007 overlay error should be controlled
under 12 nm for sub-60-nm memory devices. To meet such a tight requirement, lot-to-lot high order overlay correction
(HOC) is evaluated for the gate and contact layers of dynamic random access memory. A commercial package of HOC is
available from scanner makers such as ASML, Canon, and Nikon. Note that only wafer corrections are investigated for
this particular experiment. Reticle corrections are excluded. Experimental results verify 1 to 3 nm of overlay
improvement by applying HOC. However, the amount of improvement is layer (process) dependent. It turned out that
HOC is not an overall solution. It should be applied carefully for certain process conditions. Detailed experimental
results are discussed.
Fast mask CD uniformity measurement using zero order diffraction from memory array pattern
Show abstract
CD uniformity (CDU) control is becoming a greater concern in the lithographic process and must be controlled more tightly as
design rules shrink. Traditionally, CDU is measured through discrete spatial sampling, and an interpolated data
map represents the uniformity trends within the shot and across the wafer. There is a growing requirement for higher sampling resolution
in wafer CDU mapping. However, achieving it with traditional methods such as CD-SEM and OCD is extremely
time-consuming. To overcome the throughput limitation, an earlier approach used an inspection tool to
measure CD trends in array areas, which showed good correlation with traditional CD measurement. In this paper, we
suggest a fast mask CD error estimation method using the 0th order of diffraction. To accomplish fast measurement, a simple
macro inspection tool was adopted to cover the full wafer area, and the scan results show good correlation with the mask
uniformity data.
Requirements of the inspection for double patterning technology reticles
Show abstract
As design rules shrink continuously, reticle inspection is becoming harsher and harsher and is now one of the most
critical issues in the mask fabrication process. The reticle inspection process burdens the entire mask process with
inspectability and detectability problems. Not only aggressive assist features but also small, dense main features
themselves may cause many false detection alarms or nuisance defects, which makes the inspection TAT (Turn-Around
Time) longer. Moreover, inspections of small, dense patterns always come with defect detectability issues.
The detectability of a defect in small, dense patterns is usually inferior to its printability because of the high MEEF
(Mask Error Enhancement Factor) resulting from those small, dense patterns.
Double Patterning Technology (DPT) [1] can effectively relax the pattern pitch; therefore, a DPT reticle pattern can
have a larger pitch than a normal Single Patterning Technology (SPT) reticle. We investigate the effect of this pitch
relaxation of DPT reticles on the inspection process.
In this paper, we compare and analyze the differences in pattern inspectability and defect detectability between DPT
reticles and SPT reticles with the same pattern sizes. In addition to these results, we also
investigate defect printability in comparison with detectability and derive the inspection requirements
for 4x-node DPT reticles from the results.
Sensitivity improvement and noise reduction of array CD mapping on memory device using inspection tool
Show abstract
Array CD uniformity can be measured with an inspection tool and has shown good correlation with traditional CD measurements
such as CD-SEM and OCD [1]. Because the inspection tool, by design, collects information over the whole wafer
area, CD mapping from inspection images yields high spatial detail within the shot and across the wafer.
However, the reflected light arising from the interaction between the sub-wavelength array pattern and the
illumination does not respond only to the CD variation of the illuminated pattern. Besides lateral CD differences, thickness
variations of the pattern and underlying films also change the reflected light intensity. Therefore, separating
noise sources other than CD variation is a crucial factor in CD mapping with an inspection tool. In addition, the
sensitivity to CD variation depends on the patterned layer materials and how they interact with the polarization of
the illumination.
A previous study showed that the light reflected from a sub-wavelength array structure contains CD variation information and
responds linearly to changes in structure volume. In this paper, a CD test box with intentional CD variation is
introduced to investigate the various parameters that change reflectivity. Wavelength, polarization, and the optical
properties of the patterned structure are varied to analyze their influence on the reflectivity signal. In parallel, the
experimental results are compared with simulation results using RCWA, and good correlation is achieved.
Analysis of systematic errors in lateral shearing interferometry for EUV optical testing
Show abstract
Lateral shearing interferometry (LSI) provides a simple means for characterizing the aberrations in optical
systems at EUV wavelengths. In LSI, the test wavefront is incident on a low-frequency grating which causes
the resulting diffracted orders to interfere on the CCD. Due to its simple experimental setup and high photon
efficiency, LSI is an attractive alternative to point diffraction interferometry and other methods that require
spatially filtering the wavefront through small pinholes, methods which notoriously suffer from low-contrast fringes and
alignment difficulties. In order to demonstrate that LSI can be accurate and robust enough to meet industry
standards, analytic models are presented to study the effects of unwanted grating and detector tilt on the system
aberrations, and a method for identifying and correcting for these errors in alignment is proposed. The models are
subsequently verified by numerical simulation. Finally, an analysis is performed of how errors in the identification
and correction of grating and detector misalignment propagate to errors in fringe analysis.
Haze generation model and prevention techniques for sulfate free cleaned mask
Show abstract
Although sulfate-free cleaning has drastically reduced the number of residual ions on the mask surface, photomask
lifetime has improved only marginally. A new haze generation mechanism in sulfate-free cleaning has been studied and
evaluated based on the surface properties of photomask thin-film materials. It was found that haze generation correlates
with substrate surface properties as well as with ionic recombination under ArF illumination. Based on this haze generation
study, a surface modification treatment has been investigated from the viewpoint of surface energy. The surface
modification treatment increases storage lifetime as well as the cumulative haze threshold energy in wafer shops.
Resist-based polarization monitoring with phase-shift masks at 1.35 numerical aperture
Show abstract
Experimental verification of Phase Shift Mask (PSM) Polarimetry is provided at numerical apertures up to 1.35.
Promising initial results of periodic monitoring of a few polarized illuminators are illustrated and track with a scanner
on-board technique to within a fraction of a percent. Earlier publications have introduced the concept and provided
experimental validation up to 0.93NA. This paper discusses a variety of design improvements to improve the usability,
flexibility and robustness of this technique at NAs up to 1.35. The specialized test reticle, which consists of a large
number of polarization-sensitive chromeless phase shifting patterns, was successfully fabricated using a commercial
mask shop. Polarization sensitivity has improved by up to 7x when compared to two earlier generation reticles, helping
to minimize the impact of experimental noise. Various use models, the experimental flow, and details on the
experimental procedure are provided. It is concluded that this resist-based method can serve as a highly sensitive
polarization monitoring system for all hyper-NA applications.
An investigation of perfluoroalkylamine contamination control
Show abstract
Perfluoroalkylamine (PFAA) contamination has been observed to be a growing contamination issue in
semiconductor fabs due to the increased tolerances demanded by next generation lithography processes. These
contaminants are found within the cleanroom, process environments and within various tools and enclosures. As a
result, effective control of PFAAs in lithographic processing and other critical environments is becoming an
important filtration problem. This work was undertaken to evaluate filter performance for the removal of PFAAs and
to provide an optimal filtration solution. The filter breakthrough performance of perfluorotributylamine (PFTBA), a
common heat transfer fluid, is investigated over a range of environmental conditions for several commercially
available adsorbent materials. The results from these accelerated tests indicate that PFAAs can be effectively
removed with activated carbon-based chemical filters.
Sub-nanometer broadband measurement of elastic displacements in optical metrology frames and other critical elements
Show abstract
This paper presents the outline for a real-time nano-level elastic deformation measurement system for high precision
optical metrology frames. Such a system is desirable because elastic deformation of metrology frame structures is a
leading cause for performance degradation in advanced lithography as well as metrology and inspection equipment. To
date, the development of such systems has been hindered by the unavailability of sufficiently sensitive and cost-effective strain
sensors. The recent introduction to the market of the IntelliVibe™ S1 strain sensors with sub-nanostrain sensitivity
makes it possible to develop a real-time nanometer level elastic deformation monitoring system. In addition to the
sufficiently sensitive and cost-effective strain sensor it is necessary to develop the analytical foundation for the
measurement system and use this foundation for the development of a signal processing algorithm that will enable the
real-time reconstruction of the elastic deformation state of a metrology frame at any given time from data transmitted by
a reasonable number of properly placed S1 strain sensors. The analytical foundation and the resulting algorithms are
demonstrated in this paper.
Measurement of low molecular weight silicon AMC to protect UV optics in photo-lithography environments
Show abstract
A new analytical method for semiconductor-specific applications is presented for the accurate measurement of low
molecular weight, silicon-containing, organic compounds TMS, HMDSO and D3.
Low molecular weight / low boiling point silicon-containing compounds are not captured for extended periods of time by
traditional chemical filters but have the same potential to degrade exposure tool optical surfaces as their high molecular
weight counterparts. Likewise, we show that capturing these compounds on sample traps that are commonly used for
organic AMC analysis does not work for various reasons.
Using the analytical method described here, TMS, HMDSO and D3 can be measured artifact-free, with at least a 50:1
peak-to-noise ratio at the method detection limit, determined through the Hubaux-Vos method and satisfying a
conservative 99% statistical confidence. Method detection limits for the compounds are 1-6 ppt in air. We present
calibration curve, capacity, capture efficiency, breakthrough, and repeatability data to demonstrate the robustness of the method.
Seventy-one real-world samples from 26 projects taken in several fab environments show that TMS is found in
concentrations 100 times higher than those of HMDSO and D3. All compounds are found in all environments in
concentrations ranging from zero to 12 ppm, but most concentrations were below 50 ppb. All compounds are noticeably
higher in litho-bays than in sub-fabs and we found all three compounds inside of two exposure tools, suggesting
cleanroom and/or tool-internal contamination sources.
Positive identification of lithographic photoresists using real-time index of refraction monitoring for reduced cost of ownership
Show abstract
This study involved installation of a real-time refractive index monitoring system into a simulated photoresist feed line as
would be used for delivery to a lithography tool. The goal was to determine whether this refractive index technology
could be used to differentiate among all possible photoresists that could potentially be delivered to a lithography tool and
resolve each one, separate from the others. The main intention is to use this technology to prevent the wrong photoresist
from being used. The use of the wrong photoresist could result in hundreds of thousands of dollars per year in wafers
scrapped in late stages of the process flow. A Swagelok(R) CR-288(R) concentration monitor, using refractive index (RI)
sensing, was installed into a simulated photoresist feed line to monitor and detect each one in real time. By integrating
the CR-288 concentration monitor into the lithographic process system, the capability for uniquely identifying and
resolving 10 out of 10 Deep Ultra Violet (DUV) photoresists was demonstrated, potentially leading to a large cost
avoidance and reduced cost of ownership.
Sub-50-nm pitch size grating reference for CD-SEM magnification calibration
Show abstract
We have developed a novel multi-layer grating pattern with a sub-50-nm pitch
size for CD-SEM magnification calibration instead of the conventional 100-nm pitch
grating reference. The sub-50-nm pitch size grating reference was fabricated by
alternating multi-layer deposition of two materials followed by
material-selective chemical etching of the cleaved cross-sectional surface. A line and
space pattern with 10-nm pitch size was easily resolved and a high-contrast secondary
electron image of the grating pattern was obtained under 1-kV acceleration voltage
using CD-SEM. The uniformity of the 20-nm pitch size of the grating was less than 1
nm in 3σ. The line edge roughness of the grating pattern was also less than 1 nm. Such a
fine and uniform grating pattern will fulfill the requirements of a magnification
calibration reference for next-generation CD-SEM.
Effective purging solution to reticle haze formation
Wei-Jui Tseng,
Shean-Hwan Chiou,
Ming-Chien Chiu,
et al.
Show abstract
The control of haze contamination on reticles has been gaining an ever-increasing focus because of its contribution to the
huge yield loss in semiconductor manufacturing. Yield improvement through the reduction of haze on reticles has been a
significant challenge with the use of a 193nm light source and the shrinkage of line widths on reticles. For a mass-production IC
manufacturing fab, an easy and practical solution is needed to prevent haze generation. In our previous study (Tseng et al.,
2008), we demonstrated a practical and effective solution to reticle haze formation at a mass production DRAM factory.
After implementing this solution, the number of wafers printed without haze development on reticles can be up to 150,000
wafers, and the maximum exposure dosage can be up to 9×108 mJ/cm2 without the detection of any printable haze. Using
the average data from more than 20 reticles, the average number of wafers printed before reticle cleaning was more than 100,000
wafers. This solution has been proven to be effective in reducing the generation of haze on reticles.
In the current study, our focus is on further improving this haze solution, with the ultimate goal of reducing haze
generation not only effectively but also economically. First, we use an ultra-low-outgassing material, antistatic PEEK, as the material of the
reticle carrier to perform the study and investigate its effect on haze generation. The total outgas data, leaching, electrical
field shielding, and surface resistance data of different polymer materials are also compared. Secondly, we optimize the
purging flow rate to reduce the running cost while maintaining performance. Our approach is to design purge nozzles
that create a smooth flow field inside the reticle SMIF pod (RSP), making it possible to maintain an ultra-clean RSP
environment with a minimal flow rate. The results show that the PEEK RSP with the newly designed purge nozzles
can provide excellent haze prevention with a lower flow rate. Detailed data are provided and compared with the previous
design. By using this new solution, the number of wafers printed without haze development on reticles can be up to
300,000 wafers, and the maximum exposure dosage can be up to 1.2×109 mJ/cm2 without the detection of any printable haze.
The average number of wafers printed before reticle cleaning was more than 170,000. This is a significant improvement
in delaying the generation of haze on reticles. A comparison of N2 and XCDA performance based on wafer exposure
shows no significant difference.
Measurement of dimensions of resist mask elements below 100 nm with the help of a scanning electron microscope
Show abstract
We studied the effect of the focusing of the electron probe of a scanning electron microscope (SEM), operating in the
slow-secondary-electron collection mode, on the form of the signal obtained when scanning nanorelief elements of two
kinds of objects: (a) resist masks, and (b) protrusions and trenches on silicon. A shift of the reference points, whose
separation is usually used to determine the size of relief elements, was observed, and this separation was found to
depend linearly on the size of the electron probe. We
propose a method to measure the width of the nanorelief element, based on the extrapolation of this linear dependence to
the zeroth size of the electron probe. With the help of this method, we measured the widths of nanorelief elements of
resist masks, as well as of protrusions and trenches on silicon.
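The extrapolation procedure described above can be sketched in a few lines (hypothetical data and function name; the authors' actual fitting details are not given in the abstract): the apparent width is fitted as a linear function of probe diameter and evaluated at zero probe size.

```python
# Sketch of zero-probe-size extrapolation (illustrative, not the authors' code).
import numpy as np

def extrapolate_width(probe_sizes, apparent_widths):
    """Fit w = a*d + w0 and return w0, the width at zero probe size."""
    a, w0 = np.polyfit(probe_sizes, apparent_widths, 1)
    return w0

# Example: apparent widths (nm) grow linearly with probe diameter (nm).
d = np.array([2.0, 4.0, 6.0, 8.0])
w = np.array([52.0, 54.0, 56.0, 58.0])
print(round(extrapolate_width(d, w), 3))  # → 50.0
```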
Aerial imaging for FABs: productivity and yield aspects
Show abstract
The economics of wafer fabs are changing rapidly at 3x-nm geometries and below. Mask-set and exposure-tool
costs are almost certain to increase the overall cost per die, requiring manufacturers to develop productivity and yield
improvements to defray the lithography cell economic burden. Lithography cell cost effectiveness can be
significantly improved by increasing mask availability while reducing the amount of mask sets needed during a
product life cycle. Further efficiency can be gained from reducing send-ahead wafers and qualification cycle time,
and elimination of inefficient metrology. Yield is the overriding die-cost modulator and is significantly more
sensitive to lithography because of the many masking steps required to fabricate an integrated circuit. Thus, for
productivity to increase with minimal yield risk, the sample space of reticle-induced sources of variation should be
large, with the shortest measurement acquisition time possible.
This paper presents the latest introduction of mask aerial imaging technology for the fab, Aera2TM for Lithography
with IntenCTM, as an enabler for efficient lithography manufacturing. IntenCD is a high throughput, high density
mask-based critical dimension (CD) mapping technology, with the potential for increasing productivity and yield in
a wafer production environment. Connecting IntenCD to a feed-forward advanced process control (APC) loop
significantly reduces the amount of traditional CD metrology required for robust wafer CD uniformity (CDU)
correction and improves wafer CDU. This in turn improves the lithography process window and yield and contributes
to cost reduction and shorter qualification cycle times for new reticles.
Advanced mask technology has introduced a new challenge. Exposure to 193nm wavelength stimulates haze growth
on the mask and imposes a regular cleaning schedule. Cleaning eventually causes mask degradation. Haze growth
impacts mask CD uniformity and induces global transmission fingerprint variations. Furthermore, aggressive
cleaning may damage the delicate sub-resolution assist features. IntenCD-based CDU fingerprint correction can
optimize the regular mask cleaning schedule, extending clean intervals and therefore the overall mask life
span. This mask availability enhancement alone reduces the amount of mask sets required during the product life
cycle and potentially leads to significant savings to the fab.
In this paper we present three case studies from a wafer production fab and a mask shop. The data demonstrate clear
productivity and yield enhancements and are the outcome of a range of new applications made possible by integrating
the recently introduced Applied Materials Aera2TM for Lithography aerial imaging inspection tool with the litho
cluster.
Improved mask-based CD uniformity for gridded-design-rule lithography
Show abstract
The difficulties encountered during lithography of state-of-the-art 2D patterns are
formidable, and originate from the fact that deep sub-wavelength features are being
printed. This results in a practical limit of k1 ≥0.4 as well as a multitude of complex
restrictive design rules, in order to mitigate or minimize lithographic hot spots. An
alternative approach, that is gradually attracting the lithographic community's
attention, restricts the design of critical layers to straight, dense lines (a 1D grid), that
can be relatively easily printed using current lithographic technology. This is then
followed by subsequent, less critical trimming stages to obtain circuit functionality.
Thus, the 1D gridded approach allows hotspot-free, proximity-effect-free lithography
of ultra-low-k1 features. These advantages must be supported by a stable CD control
mechanism. One of the overriding parameters impacting CDU performance is photo
mask quality. Previous publications have demonstrated that IntenCDTM - a novel,
mask-based CDU mapping technology running on Applied Materials' Aera2TM aerial
imaging mask inspection tool - is ideally suited to detecting mask-based CDU issues in
1D (L&S) patterned masks for memory production. Owing to the aerial nature of
image formation, IntenCD directly probes the CD as it is printed on the wafer.
In this paper we suggest that IntenCD is naturally suited to detecting mask-based CDU
issues in 1D gridded-design-rule (GDR) masks. We then study a novel method of recovering and
quantifying the physical source of printed CDU, using a novel implementation of the
IntenCD technology. We demonstrate that additional, simple measurements, which
can be readily performed on board the Aera2TM platform with minimal throughput
penalty, may complement IntenCD and allow a robust estimation of the specific
nature and strength of mask error source, such as pattern width variation or phase
variation, which leads to CDU issues on the printed wafer. We finally discuss the
roles played by IntenCD in advanced GDR mask production, starting with tight
control over the mask production process, continuing to mask qualification at the mask
shop and ending with in-line wafer CDU correction in fabs.
Study of advanced mask inspection optics with super-resolution method for next-generation mask fabrication
Show abstract
An ArF (193nm) laser exposure tool with high numerical aperture (NA) will expand its
lithography potential to 45nm node production and even beyond. Consequently, a mask inspection system with a
wavelength nearly equal to 193nm is required so as to detect defects of the masks using resolution enhancement
technology (RET). A novel high-resolution mask inspection platform using DUV wavelength has been developed, which
works at 199nm, a wavelength close to that of the ArF exposure tool. In order to adapt the 199nm optics for defect
detection on next-generation masks at the hp2x nm node and beyond under appropriate conditions, further developments
such as an illumination condition modification technique have been studied. The illumination optics has the advantage
that a super-resolution method can be applied by adding optics. To evaluate the super-resolution effect of the
illumination condition control optics, the interaction of light with mask features is calculated rigorously using the
RCWA (Rigorous Coupled-Wave Analysis) method.
In this paper, the image contrast enhancement achieved with the newly designed super-resolution optics, applied to the
transmitted- and reflected-light image acquisition systems, is presented through simulation and experiment.
Novel lithography approach using feed-forward mask-based wafer CDU correction increases fab productivity and yield
Show abstract
The extension of ArF lithography through reduced k1, immersion and double patterning techniques makes lithography a
difficult challenge. Currently, the concept of simple linear flow from design to functional photo-mask is being replaced
by a more complex scheme of feedback and feed-forward loops which have become part of a complex computational
lithography scheme. One such novel lithography concept, called "holistic lithography", was recently introduced by
ASML, as a scheme that makes the lithography process a highly efficient solution for the scaled down geometries. This
approach encourages efficient utilization of computational lithography and the use of feed-forward and feed-back critical
dimension (CD) and overlay correction loops. As feature dimensions shrink toward the 3x nodes, with k1 reaching
the optics limitations, mask error enhancement factor (MEEF) values grow fast, making the mask uniformity
fingerprint, and its degradation throughout the mask's lifetime, a significant factor in printed CDU on the wafer.
Whereas the consensus is on the need for a growing density of intra-field data, traditional critical dimension scanning
electron microscope (CD-SEM) feedback loops to the litho-cell become unsuitable due to the high-density CD
measurement requirements. Earlier publications proposed implementing the core of the holistic lithography concept by
combining two technologies: Applied Materials' IntenCDTM and ASML's DoseMapper. IntenCD metrology data is
streamed in a feed-forward fashion through DoseMapper and into the scanner to create a dose compensation recipe
which improves the overall CDU performance. It has been demonstrated that IntenCD maps can be used to efficiently
reduce intra-field CDU on printed wafers.
In this paper we study the integration concept of IntenCD and DoseMapper in a production environment. We implement
the feed-forward concept by feeding IntenCD inspection data into DoseMapper that is connected to ASML's
TWINSCANTM XT:1900i scanner. We apply this concept to printed wafers and demonstrate a significant reduction in
intra-field CDU. This concept can effectively replace the feedback concept using send-ahead wafers and extensive
CDSEM measurements. The result is a significant cost saving and fab productivity improvement. By routinely
monitoring mask-based CDU, we propose that all photo-induced transmission degradation effects can be compensated
through the same mechanism. The result would be longer intervals between cleans, improved mask lifetime, and better
end of line device yield.
CD-bias reduction in CD-SEM line-width measurement for the 32-nm node and beyond using the model-based library method
Show abstract
The measurement accuracy of critical-dimension scanning electron microscopy (CD-SEM) at feature sizes of 10 nm and
below is investigated and methods for improving accuracy and reducing CD bias (the difference between true and
measured CD values) are proposed. Simulations indicate that CD bias varies with feature size (CD) when the electron
scatter range exceeds the CD. As the change in the CD-SEM waveform with decreasing CD is non-uniform, the CD bias
in the results is strongly dependent on the algorithm employed to process the CD-SEM data. Use of the threshold method
with a threshold level equal to 50% (Th = 50%) is shown to be effective for suppressing the dependence of CD bias on
CD. Through comparison of experimental CD-SEM measurements of silicon line patterns (7-40 nm) with atomic force
microscopy (AFM) measurements, it is confirmed that the threshold method (Th = 50%) is as effective as predicted,
affording a largely invariant CD bias. The model-based library (MBL) method, which is theoretically capable of
eliminating CD bias, is demonstrated to reduce the CD bias to near-zero levels. These experiments demonstrate the
feasibility of next-generation CD-SEM for the measurement of feature sizes of the order of 10 nm and smaller.
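As a rough illustration of the 50% threshold algorithm discussed above (a sketch on a synthetic waveform; signal shape and names are assumptions, not the authors' implementation): the CD is the distance between the two positions where the SEM signal crosses the level halfway between its minimum and maximum.

```python
# Threshold method sketch: find 50% crossings of a 1-D SEM line waveform.
import numpy as np

def threshold_cd(x, signal, th=0.5):
    level = signal.min() + th * (signal.max() - signal.min())
    above = signal >= level
    # indices just before each crossing of the threshold level
    crossings = np.nonzero(np.diff(above.astype(int)))[0]
    left, right = crossings[0], crossings[-1]
    # linear interpolation of the exact crossing positions
    def interp(i):
        x0, x1, s0, s1 = x[i], x[i + 1], signal[i], signal[i + 1]
        return x0 + (level - s0) * (x1 - x0) / (s1 - s0)
    return interp(right) - interp(left)

# Synthetic waveform: a 20 nm line with smooth edges at x = 20 and x = 40 nm.
x = np.linspace(0, 60, 601)  # position in nm
signal = 1 / (1 + np.exp(-(x - 20) * 2)) - 1 / (1 + np.exp(-(x - 40) * 2))
print(round(threshold_cd(x, signal), 1))  # → 20.0
```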
New approach for mask-wafer measurement by design-based metrology integration system
Show abstract
OPC (optical proximity correction) techniques are getting more complicated toward the 32 nm technology node and
beyond, i.e. from moderate OPC to aggressive OPC. Also, various types of phase-shift masks have been introduced, and
their manufacturing processes are complicated. In order to shorten TAT (turnaround time), mask design techniques need
to be considered in addition to lithography techniques.
Furthermore, the lens aberration of exposure systems is getting smaller, so their current performance is very close to
ideal. On the other hand, as device feature sizes shrink to the 32nm technology node, cases begin to be reported where
the feature dimension of a mask pattern does not match that of the corresponding printed pattern. Therefore, it is
indispensable to understand the pattern-size correlation between a mask and the corresponding printed wafer in order to
improve processing accuracy and quality, given that device sizes are now so small that low-k1 lithography is widely
used in production.
One approach to improving the estimation accuracy of lithography simulation is the use of contour data extracted from
mask SEM images in addition to the application of a mask model.
This paper describes a newly developed integration system that aims to solve the issues above, and its applications. This
is a system that integrates the mask CD-SEM (critical dimension scanning electron microscope) CG4500, the wafer
CD-SEM CG4000, and the OPC evaluation system DesignGauge, all manufactured by Hitachi High-Technologies.
The measurement accuracy improvement was examined by executing mask-wafer same-point measurement, i.e.
measurement of the corresponding points with the same measurement algorithm, utilizing the new system. First, we
measured mask patterns and verified their validity based on the measurement value, the image, the measurement
parameters and the coordinates. Then a job file was formulated for a wafer CD-SEM using the new system so as to
measure the corresponding patterns that were exposed using the mask. In addition, averaged CD measurement was
evaluated in order to improve capability.
Furthermore, in order to estimate the pattern shape with high accuracy, a contour was calculated from a mask SEM
image, and the result was used with the design data in a litho simulation. This realizes a verification that includes mask
fabrication error.
This system is expected to be beneficial for both mask makers and device makers.
A practical application of multiple parameters profile characterization (MPPC) using CD-SEM on production wafers in Hyper-NA lithography
Show abstract
With the improved resolution of immersion lithography through Hyper-NA (numerical aperture) and low-k1 scaling,
lithographers face the problem of decreasing depth of focus and, in turn, reduced process latitude. High-precision
monitoring of the reduced process latitude that comes with Hyper-NA and low-k1 is important in order to successfully
introduce RET (resolution enhancement technique) lithography into high-volume production.
MPPC (Multiple Parameters Profile Characterization) is a function that extracts pattern-shape information from a
measured e-beam signal. The MPPC function becomes a key technique for pattern-profile verification from top-down
SEM images in Hyper-NA lithography, because changes related to the pattern profile can be detected with it.
In this work, we explored a practical application of the MPPC function by clarifying the relationship between MPPC
indices and litho parameters for a specific lithography application. We performed two kinds of experiments to verify
the effectiveness of the MPPC function. The first experiment explored printed-image contrast using the WB, with
pattern-shape changes related to the image printing conditions. The second experiment explored pattern-shape changes
due to resist contrast by varying the process conditions and observing WB behavior. In consequence, we demonstrated
a practical application of the MPPC function through quantification using the WB and assessed its process monitoring
capability.
The challenge of this research is the practical application of the MPPC function on production wafers for a specific
lithography application. We believe this application can be effective for process monitoring and control in Hyper-NA
lithography.
Improving capability of recipe management on CD-SEM using recipe diagnostic tool
Kaoru Nishiuchi,
Shinichi Nakano,
Masaki Nishino,
et al.
Show abstract
In the semiconductor manufacturing industry, CD-SEM (Critical Dimension - Scanning Electron Microscope) is
used for CD measurements on semiconductor wafers. In the CD-SEM, there is a program called "recipe" which is a
series of files containing information on measurement conditions (i.e. where, how and what to measure). The recipe
controls all of the parameters such as auto focus, pattern recognition and measurement parameters. Depending on the
quality of this recipe, there is the possibility of errors in auto focus, pattern recognition and measurements. There are
many ways in which to improve the quality of the recipe for better productivity. One method to obtain a higher
quality recipe requires the use of a wafer and a CD-SEM. When optimizing the recipe, the quality of improvements to
the recipe depends heavily on the skill of the engineer and on wafer and CD-SEM conditions. In the semiconductor
manufacturing fab it is very time-consuming to resolve recipe errors, as doing so requires wafer availability, arranging
CD-SEM tool time, and analysis of the root cause of the error.
This paper discusses a recipe diagnostic tool to evaluate and analyze the root cause of recipe errors. This tool can
provide not only analysis of the error root cause, but can also help the user determine how to improve the recipe
quality by pinpointing the problematic recipe parameters. This will allow the user to properly select the parameters
that need adjustment in order to obtain the best performance recipe possible. This method can reduce engineering
time for recipe control by a factor of 10.
Three-dimensional profile extraction from CD-SEM top-view image
Show abstract
Emerging three-dimensional (3D) transistor structures have increased the demand for an easy and practical method to
measure pattern feature metrics (such as CD, line-edge roughness, etc.) as a function of height (z coordinate). We have
examined 3D pattern-profile extraction from a top-view image obtained using a critical-dimension scanning electron
microscope (CD-SEM). An atomic force microscope (AFM) was used to measure 3D pattern profiles as a reference. In
this examination, line-edge positions were first obtained from a CD-SEM image at various threshold levels, and the
result was compared with the reference profile measured using the AFM. From this comparison, a mapping function
from threshold levels of CD-SEM image-processing to z coordinates is obtained. Using this mapping function, 3D
pattern profiles were reconstructed from CD-SEM signal profiles, and the obtained profiles were similar to the directly
obtained cross-sectional profile. Put simply, a 3D pattern-profile was extracted from a top-view image successfully.
Though the results are not sufficient to confirm the validity of our method yet, the method may feasibly be introduced
for quick and easy 3D measurement.
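The threshold-to-height mapping described in this abstract can be sketched as follows (all calibration numbers, names, and the synthetic waveform are illustrative assumptions): an AFM reference yields a lookup from CD-SEM threshold level to height z, which is then applied to edge positions extracted at several thresholds.

```python
# Sketch of 3D profile reconstruction from a top-view edge signal.
import numpy as np

# Assumed calibration: threshold level (%) vs. height z (nm), obtained once by
# matching CD-SEM edge positions against an AFM reference profile.
cal_threshold = np.array([20.0, 40.0, 60.0, 80.0])
cal_z = np.array([55.0, 40.0, 22.0, 5.0])

def edge_positions(x, signal, thresholds):
    """Edge x-position at each threshold level, for a monotonically rising edge."""
    lo, hi = signal.min(), signal.max()
    levels = lo + thresholds / 100.0 * (hi - lo)
    return np.interp(levels, signal, x)  # signal must be monotonic here

def reconstruct_sidewall(x, signal, thresholds):
    """Map each threshold's edge position to a height via the calibration."""
    xs = edge_positions(x, signal, thresholds)
    zs = np.interp(thresholds, cal_threshold, cal_z)
    return xs, zs  # sidewall profile: edge position x at height z

# Synthetic, strictly rising edge signal
x = np.linspace(0.0, 10.0, 1001)
signal = 1.0 / (1.0 + np.exp(-(x - 5.0)))
xs, zs = reconstruct_sidewall(x, signal, np.linspace(20.0, 80.0, 7))
```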
3D-AFM booster for mass-production nanoimprint lithography
Show abstract
As we move toward the sub-32nm node, the question of lithography cost will play a key role with the introduction of,
for example, double patterning. Nanoimprint lithography (NIL) is one candidate that could become competitive.
Indeed, the technique is very promising and has already proven its potential compatibility with high-volume, low-cost
manufacturing of advanced devices. To transfer this ability to a real industrial environment, progress has to be made in
manufacturing high-resolution, accurate molds. To this end, fine topographical characterizations of both the mold
coated with an anti-sticking layer and the imprinted materials have to be performed. Fabricated patterns have to be
very well controlled in terms of geometry quality and uniformity across the whole wafer. Moreover, the defectivity of
the imprint process must be understood and well controlled to introduce such a lithography process into the industrial
environment.
In this paper, we present experiments carried out with 3D-AFM technology on nanoimprint molds and various
imprinted wafers in order to understand more deeply both the advantages and drawbacks of this emerging lithography
technique. For instance, we discuss the anti-sticking layer that must be applied to the mold before any imprint to keep
the final industrial process as reliable as possible. We also present experimental results obtained for both UV-NIL and
hot-embossing NIL, two different candidates depending on the final application. In a third part, we show and discuss
experimental results related to defectivity, the main drawback of nanoimprint, through the study of growing capillary
bridges.
Simulation of secondary electron emission in helium ion microscope for overcut and undercut line-edge patterns
Show abstract
In order to study the topographic contrast of line-edge patterns in a scanning ion microscope (SIM) using a
helium (He) beam, a Monte Carlo simulation of secondary electron (SE) emission from silicon (Si) under the impact of
He ions in the energy range of tens of keV is performed. Edges with overcut and undercut profiles at different sidewall
angles are modeled and the patterns are scanned with a 30 keV He ion beam, and the line profiles of the SE intensity
are calculated assuming zero-sized beams. The results are compared with those of 30 keV Ga ion and 1 keV
electron beams. Furthermore, the pseudo-images of critical-dimension (CD) line patterns with different widths are
constructed from the SE profiles. The calculated SE yields of Si for 10-40 keV He ions increase with increasing impact
energy and become larger than those for low-energy electrons (keV or less). When scanning line edges formed on a Si
surface, both a large, sharp peak and a small dip appear in the SE yield. The peak is much higher for the He ion beam
than for the Ga ion and electron beams, whereas its width is smaller: the FWHMs are 3.8 nm for 30 keV He ions, 7.2 nm
for 30 keV Ga ions and 8.0 nm for 1 keV electrons. This indicates that the line edge is more clearly distinguished by He
ions. A change in the sidewall angle changes the shape of the hump in the SE profile at the sidewall of overcut edges,
due to the incident-angle dependence of the SE yield; this is clearly seen for all beams. However, much less change in
the line profiles of undercut edges is found for Ga ion and electron beams.
Nonplanar high-k dielectric thickness measurements using CD-SAXS
Show abstract
Non-planar transistor architectures, such as tri-gates or "FinFETs", have evolved into important solutions to the severe
challenges emerging in thermal and power efficiency requirements at the sub-32 nm technology nodes. These
architectures strain traditional dimensional metrology solutions due to their complex topology, small dimensions, and
number of materials. In this study, measurements of the average dielectric layer thickness are reported for a series of
structures that mimic non-planar architectures. The structures are line/space patterns (≈ 20 nm linewidth) with a
conformal layer of sub-15 nm thick high-k dielectric. Dimensions are measured using a transmission X-ray scattering
technique, critical dimension small angle X-ray scattering (CD-SAXS). Our test results indicate that CD-SAXS can
provide high precision dimensional data on average CD, pitch, and high-k dielectric layer thickness. CD-SAXS results
are compared with analogous data from both top-down scanning electron microscopy and cross-sectional transmission
electron microscopy. In addition, we demonstrate the capability of CD-SAXS to quantify a periodic deviation in pitch
induced by an imperfection in the phase shift mask.
High-precision CD matching monitoring technology using profile gradient method for the 32-nm technology generation
Show abstract
A measurement uncertainty requirement of 0.37 nm has been set for critical dimension (CD) metrology tools at the
32 nm technology generation, according to the ITRS [1]. Continual development of the fundamental performance of the
critical dimension scanning electron microscope (CD-SEM) is essential, as in the past, and for this generation a highly
precise tool management technology that monitors and corrects tool-to-tool CD matching will also be indispensable.
The factor that most strongly influences tool-to-tool matching is a slight difference in electron-beam resolution, which
cannot be determined by visual inspection of the SEM images. Thus, a method for quantitative evaluation of resolution
variation was investigated and the Profile Gradient (PG) method was developed. In its development, consideration was
given to its sensitivity with respect to CD variation and to its data-sampling efficiency, to achieve sufficient precision,
speed and practicality for a monitoring function applicable to a mass semiconductor production line. The ability to
evaluate differences in image sharpness was confirmed using this method. Furthermore, regarding the CD matching
management requirements, this method is robust against CD variation and is anticipated to be a more practical
monitoring method than monitoring the actual CD variation on a mass semiconductor production line.
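One plausible way to quantify beam-resolution differences from an edge signal, in the spirit of the profile-gradient idea above (an illustrative sketch, not the authors' exact PG algorithm): a sharper beam yields a steeper maximum slope across an edge, so the maximum gradient can serve as a matching monitor.

```python
# Sketch of a profile-gradient-style sharpness metric (hypothetical).
import numpy as np

def max_profile_gradient(x, signal):
    """Maximum absolute slope of an edge signal, used as a sharpness score."""
    return np.max(np.abs(np.gradient(signal, x)))

x = np.linspace(-10, 10, 2001)
sharp = 1 / (1 + np.exp(-x / 0.5))    # well-focused edge
blurred = 1 / (1 + np.exp(-x / 1.0))  # degraded beam resolution
print(max_profile_gradient(x, sharp) > max_profile_gradient(x, blurred))  # → True
```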
CD budget analysis on sub-50-nm DRAM device: global CD variation to local CD variation
Show abstract
In this study, an overall critical dimension (CD) error budget analysis procedure is proposed to estimate the sources of
CD error. Until now, local CD variation has been treated as noise or uncertainty, since it has been considered not to be
real and is difficult to measure. However, actual measurement results show that local CD variation occupies a significant
portion of the overall CD variation. We included the local variation in the overall CD budget analysis, and performed a
budget break-down of the local CD variation. This analysis was performed on the layers of a sub-50nm DRAM device.
We calculated local CD uniformity from CD SEM measurement data having multi-point measurement on each frame.
Metrology error of wafer CD SEM and mask CD SEM was measured, and local CD uniformity (CDU) of mask was also
measured. To estimate the impact of the mask local CDU, we performed a simulation with a virtual mask shape that has
the same level of local variation as the real mask. The remaining budget, excluding the metrology and mask-induced
contributions, is treated as process roughness. To predict the budget caused by process roughness, a randomly varying
threshold map
was applied. In this approach, the local CD variation of 2-dimensional patterns is considered as an extension of the LWR
in 1-dimension.
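Assuming the contributors are independent, the budget break-down described above amounts to subtracting variances in quadrature; a minimal sketch with illustrative numbers (not the paper's data):

```python
# Quadrature subtraction for a CD error budget (illustrative sketch).
import math

def residual_sigma(total, *components):
    """Remaining 3-sigma budget after removing known contributors in quadrature."""
    var = total**2 - sum(c**2 for c in components)
    if var < 0:
        raise ValueError("components exceed the measured total")
    return math.sqrt(var)

# Example (hypothetical 3-sigma values, in nm):
total_local = 3.0   # measured local CD variation
metrology   = 1.0   # wafer/mask CD-SEM metrology error
mask_local  = 1.5   # mask-induced local CDU (from simulation)
process_roughness = residual_sigma(total_local, metrology, mask_local)
print(round(process_roughness, 2))  # → 2.4
```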
Intrafield process control for 45 nm CMOS logic patterning
Show abstract
CMOS 45nm technology, and especially logic gate patterning, has led us to hunt for every nanometer we could find
to reach aggressive targets in terms of overall CD budget. Last year we presented a paper ("Process Control for 45 nm
CMOS logic gate patterning" - B. Le Gratiet, SPIE 2008; 6922-33) showing the evaluation of our process at that time.
One of the key items was intrafield control. Preliminary data were presented regarding intrafield CD corrections using
Dose MapperTM. Since then, more work has been done in this direction, and not only for the GATE level.
Depending on the reticle specification grade, process MEEF and scanner performance, intrafield CD variation can
reach quite high CD ranges and become a non-negligible part of the overall budget. Although reticles can achieve very
good levels of CD uniformity, they all have their own "footprint", which becomes a systematic error. The key point
then is to be able to measure this footprint and correct for it on the wafer. Scanner suppliers provide tools like Dose
MapperTM to modify the intrafield exposure dose profile. Generating and using a proper exposure "subrecipe" requires
in-line intrafield control on production wafers. This paper presents the status of our work on this subject, with some
results related to global-gate CMOS 45nm CD variability improvement, including etch process compensation with
Dose Mapper.
Contour quality assessment for OPC model calibration
Show abstract
Site-based SEM measurements produce accurate OPC models at the 180nm to 65nm technology nodes, but the lack of
2D information has prompted new calibration methods for sub-65nm designs. A hybrid technique using site-based SEM
measurements together with SEM contours has been developed to produce more accurate OPC models. Contour samples
account for 2D effects while CD sites provide high accuracy 1D measurements. SEM contours are prone to sampling
and processing errors as well as extensive calibration run time. We develop a method to filter out inferior samples prior
to model calibration to effectively decrease calibration runtime and increase model accuracy. Fitness and coverage
metrics are used to assess the quality of the contour data in order to select the best subset of the calibration contours.
Our results demonstrate a selection routine that consistently performs better than picking contours at random, and we
discuss the trade-offs between coverage, accuracy and runtime with respect to model quality.
Applications of AFM in semiconductor R&D and manufacturing at 45 nm technology node and beyond
Show abstract
Continuing demand for high-performance microelectronic products has propelled integrated circuit technology to the 45 nm
node and beyond. The shrinking device feature geometry has created unprecedented challenges for dimensional metrology in
semiconductor manufacturing and research and development. Automated atomic force microscopy (AFM) has been used
to meet this challenge and characterize ever narrower lines, trenches, and holes at the 45nm technology node and beyond. AFM is
an indispensable metrology technique capable of non-destructive full three-dimensional imaging, surface morphology
characterization, and accurate critical dimension (CD) measurements. While all available dimensional metrology
techniques approach their limits, AFM continues to provide reliable information for development and control of
processes in memory, logic, photomask, image sensor, and data storage manufacturing. In this paper we review up-to-date
applications of automated AFM in each of these semiconductor industry sectors. To demonstrate the benefits
of AFM at the 45 nm node and beyond, we compare the capability of automated AFM with established in-line and off-line
metrologies such as critical dimension scanning electron microscopy (CD-SEM), optical scatterometry (OCD), and
transmission electron microscopy (TEM).
WLCD: a new system for wafer level CD metrology on photomasks
Sven Martin,
Holger Seitz,
Wolfgang Degel,
et al.
Show abstract
With decreasing feature size, the requirements for CD uniformity (CDU) on the wafer have become crucial for achieving
the required yield in the wafer fab. This translates into tighter CDU specifications on the photomask. Currently, mask CDU
is mainly measured by mask CD-SEM tools. However, due to strong OPC and high MEEF, mask CDU is not directly
related to wafer CDU. A new aerial-imaging-based optical system that measures wafer-level CD directly on photomasks
under scanner conditions has been developed by Carl Zeiss SMS. First results of the alpha tool show that the
new tool has extremely good CD repeatability and stability. Furthermore, the effect of the scanner settings on CD
uniformity is demonstrated.
3D touch trigger probe based on fiber Bragg gratings
Show abstract
This paper presents a 3D touch trigger probe based on Fiber Bragg Gratings (FBGs). The sensing principle is the Bragg
equation λ=2nΛ: external strain and temperature changes alter both the effective refractive index (n) and the grating
pitch (Λ) of the fiber core, so the Bragg wavelength λ changes accordingly. The probe adopts an FBG sensor system with
four FBGs of identical parameters (three sensing FBGs and one matching FBG). Light from a broadband source enters the
sensing FBGs through one coupler; the reflected light is routed to the matching FBG via another coupler and is finally
captured by a high-precision optoelectronic detector that monitors the energy of the light reflected by the matching
FBG. When the tip ball contacts a workpiece it swings, rotating the plate through a rigid connection, and the
displacement of the tip ball is transferred into strain on the sensing FBGs. This strain shifts the Bragg wavelength of
the reflected light and therefore changes the energy of the light returned by the matching FBG. The probe based on FBG
sensing opens a new branch of application for fiber grating sensors. Key characteristics of a touch
trigger probe, such as repeatability, trigger force, and resolution, are also studied.
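The Bragg sensing relation above can be illustrated with a short numeric sketch. The effective index, grating pitch, and photo-elastic coefficient used below are typical textbook values for silica fiber, not parameters from this paper:

```python
# Sketch of the FBG sensing relation (illustrative values only).
# Bragg condition: lambda_B = 2 * n_eff * Lambda, where n_eff is the effective
# refractive index of the fiber core and Lambda is the grating pitch.

def bragg_wavelength(n_eff: float, pitch_nm: float) -> float:
    """Return the Bragg wavelength (nm) for a given effective index and pitch."""
    return 2.0 * n_eff * pitch_nm

def wavelength_shift(lambda_b_nm: float, strain: float, p_e: float = 0.22) -> float:
    """Approximate strain-induced Bragg wavelength shift (nm).

    Uses the common first-order relation d(lambda)/lambda ~ (1 - p_e) * strain,
    where p_e is the effective photo-elastic coefficient (~0.22 for silica).
    """
    return lambda_b_nm * (1.0 - p_e) * strain

lam = bragg_wavelength(n_eff=1.468, pitch_nm=528.0)   # ~1550 nm band
shift = wavelength_shift(lam, strain=1e-6)            # 1 microstrain
print(f"Bragg wavelength: {lam:.1f} nm, shift per microstrain: {shift * 1e3:.2f} pm")
```

Picometer-scale shifts per microstrain are why a high-precision detector on the matching FBG is needed to resolve probe displacement.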
In-die registration metrology on future-generation reticles
Show abstract
Today, mask metrology is performed on dedicated registration test patterns in the kerf area between active dies.
Accordingly, the measurement performance of the actual registration metrology system on these test patterns is very well
characterized. However, it is commonly understood that with the introduction of reticles for the 32nm technology node,
the overlay requirements will become more stringent and therefore reticles need to be characterized in greater detail. In
order to achieve the tighter overlay performance targets on the wafer, registration metrology on the mask is expected to
include "active" structures in the die. There will be more of an emphasis on In-Die metrology if Double Patterning
Lithography (DPL) will finally become the technology of choice for the 32nm lithography.
Measurement results are obtained on state-of-the-art registration metrology tools on test reticles simulating metrology in
the dense active array. These data are analyzed and compared with results achieved on test reticles using standard
registration test patterns.
Advanced modeling strategies to improve overlay control for 32-nm lithography processes
Show abstract
Overlay control is gaining more attention in recent years as technology moves into the 32nm era. Strict overlay
requirements are being driven not only by the process node but also the process techniques required to meet the design
requirements. Double patterning lithography and spacer pitch splitting techniques are driving innovative thinking with
respect to overlay control. As lithographers push the current capabilities of their 193nm immersion exposure tools they
are utilizing newly enabled control 'knobs'. 'Knobs' are defined as the adjustment points that add new degrees of
freedom for lithographers to control the scanner. Expanded control is required as current scanner capabilities are at best
marginal in meeting the performance requirements of ever more demanding process nodes. This abstract is an
extension of the SPIE 2008 paper in which we performed thorough sources of variance analysis to provide insight as to
the benefits of utilizing high order scanner control knobs [1]. The extension this year is to expand the modeling
strategies and to validate the benefit through carefully designed experiments. The expanded modeling characterization
will explore not only high order correction capabilities but also characterize the use of field by field corrections as a
means to improve the overlay performance of the latest generation of immersion lithography tools. We will explore
various correction strategies for both grid and field modeling using KT Analyzer™.
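As a toy illustration of the grid modeling described above, the sketch below fits classic linear overlay terms (translation, scale, rotation) to synthetic mark data by least squares. The three-term-per-axis model and all numbers are invented for illustration and are far simpler than the high-order, field-by-field models studied in the paper:

```python
import numpy as np

# Synthetic overlay marks on a wafer grid (positions in mm, offsets in nm).
rng = np.random.default_rng(0)
x = rng.uniform(-100.0, 100.0, 50)
y = rng.uniform(-100.0, 100.0, 50)

# "True" linear grid terms: translation, scale, rotation (invented values).
Tx, Ty, sx, sy, rx, ry = 2.0, -1.0, 0.05, 0.03, 0.01, -0.02
dx = Tx + sx * x - rx * y          # measured x-overlay at each mark
dy = Ty + sy * y + ry * x          # measured y-overlay at each mark

# Least-squares fit of the linear model terms from the measured offsets.
Ax = np.column_stack([np.ones_like(x), x, -y])
Ay = np.column_stack([np.ones_like(y), y, x])
coef_x, *_ = np.linalg.lstsq(Ax, dx, rcond=None)
coef_y, *_ = np.linalg.lstsq(Ay, dy, rcond=None)
print("x terms (Tx, sx, rx):", np.round(coef_x, 4))
print("y terms (Ty, sy, ry):", np.round(coef_y, 4))
```

High-order schemes extend the same least-squares machinery with polynomial terms in x and y, and field-by-field correction repeats the fit per exposure field.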
Overlay mark optimization using the KTD signal simulation system
Show abstract
As the overlay performance and accuracy requirements become tighter, the impact of process parameters on the target
signal becomes more significant. Traditionally, in order to choose the optimum overlay target, several candidates are
placed in the kerf area. The candidate targets are tested under different process conditions, before the target to be used in
mass production is selected. The candidate targets remain on the mass-production mask, and although they will not be
used for overlay measurements they still consume kerf real estate. To improve the efficiency of this process we
propose the KTD (KLA-Tencor Target Designer), an easy-to-use system that enables the user to select the
optimum target based on advanced signal simulation. Implementing the KTD in production is expected to save 30% of
kerf real estate through a more efficient target-design process, as well as reduced engineering time.
In this work we demonstrate the capability of the KTD to simulate the Archer signal in the context of advanced
DRAM processes. For several stacks we are comparing simulated target signals with the Archer100 signals. We
demonstrate the robustness feature in the KTD application that enables the user to test the target sensitivity to process
changes. The results indicate the benefit of using KTD in the target optimization process.
Fast analysis and diagnostics for improving overlay control: moving beyond the black box approach
Show abstract
Controlling overlay residuals to the lowest possible levels is critical for high yielding mass production success
and is one of the most pressing challenges for lithographers. In this paper, the authors will show how the use of certain
systematic diagnostic and analysis tools combined with a source of variance methodology can allow users to promptly
separate the overlay sources of error into different contributors and quickly make the proper corrections. This
methodology, together with the analysis tools, provides a turnkey solution that helps process and equipment engineers
make fast decisions and act quickly to overcome these overlay challenges, one of the key factors in staying
ahead.
The study and simulation of high-order overlay control including field-by-field methodologies
Show abstract
Overlay continues to be one of the key challenges for photolithography in semiconductor manufacturing. It becomes
even more challenging due to the continued shrinking of the device node. The corresponding tighter overlay specs
require the consideration of new paradigms for overlay control, such as high-order control schemes and/or field-by-field
overlay control. These approaches have been demonstrated to provide tighter overlay control for design rule structures,
and can be applied to areas such as double patterning lithography (DPL), as well as for correcting non-linear overlay
deformation signatures caused by non-lithographic wafer processing. Previously we presented a study of high-order
control applied to high order scanner correction, high order scanner alignment, and the sampling required to support
these techniques. Here we extend this work, using sources of variation (SOV) techniques, and have further studied the
impact of field by field compensation. This report will show an optimized procedure for high order control using
production wafers and field by field control.
Sampling strategy: optimization and correction for high-order overlay control for 45nm process node
Show abstract
The tight overlay budgets required for 45nm and beyond make overlay control a very important topic. With the adoption
of immersion lithography, the incremental complexity brings much more difficulty to analyzing the source of variation
and optimizing the sampling strategy. In this paper, there will be a discussion about how the use of an advanced
sampling methodology and strategy can help to overcome this overlay control problem and ensure that sufficient overlay
information is captured for effective production-lot excursion detection as well as rework decision making. There
will also be a demonstration of different correction methodologies to improve overlay control for dual-stage systems
in order to maximize the productivity benefits with minimal impact on overlay performance.
Automated overlay recipe setup in high-volume manufacturing: improving performance, efficiency, and robustness
Show abstract
As the semiconductor industry continues to drive toward smaller design nodes, overlay error budgets will continue to
shrink making metrology ever more challenging. Moreover, this challenge is compounded by the need to continue to
drive down costs and increase productivity, especially given the competitive and macro-economic landscape going
forward. In order to satisfy these two contradicting requirements, new ways of maintaining metrology tools and recipes
are needed. Traditionally, recipes are generated manually by operators or even metrology engineers, involving both tool
time and engineering resources. Furthermore, the influence of individual skill levels can lead to undesirable variations
and is a potential source of errors that could result in yield loss. By means of automatic recipe generation both
engineering and capital equipment resources can be minimized. Implementation of an automated recipe creation process
will also result in improved recipe integrity. In this study, we show a methodology of a highly automated recipe
generation for overlay measurements. We will outline the benefits of such an implementation and comment on the value
for all segments of the semiconductor industry as well as provide data from production fabs demonstrating these
capabilities and benefits.
Optimization of alignment strategy for metal layer on local interconnect integration
Show abstract
The influence of processing on wafer alignment is becoming an increasingly important issue. Overlay accuracy and
alignment performance must improve as design rules shrink. The alignment of metal layers in particular is affected by
processing, so alignment errors must be prevented to avoid wafer loss and reduced throughput.
Alignment of the Metal 0 layer in local interconnect integration is strongly affected not only by ILD thickness and the
resist coating process, but also by the effective phase depth.
In this paper, a new alignment strategy is developed by simulating the impact of the stack structure on alignment. The
new strategy increases the alignment signal intensity and achieves alignment robustness to process variation for
M0-to-M0C alignment in the local interconnect.
Challenges of long-term process stability and solutions for better control
Show abstract
Maintaining the stability of all litho process parameters over time is crucial to ensuring consistent litho process yield
throughout the product lifetime. The sensitivity of litho process performance to variations in litho process parameters is
getting higher as processes use lower k1 and resist dimensions get smaller. The dependence of litho cell yield on a laser
parameter change was investigated through simulations of memory patterns for various k1 values and process layers by varying
the bandwidth control level of the laser. The sensitivity of litho yield to laser bandwidth became higher when lower k1 imaging
was used. Different bandwidth control requirements were determined based on the difference in CD control requirement
of each layer as well as the difference in process window of the layout. Overall, tighter bandwidth control was required
as pattern size and k1 became smaller. Significant improvements in long term process stability were achieved after
implementation of low bandwidth variation operation at a production fab. Cymer's latest bandwidth control technology
fulfills the bandwidth control requirement for the simulated 43nm DRAM case, which has a k1 of 0.31 with 1.35NA ArF
immersion lithography.
Use of 3D metrology for process control
Show abstract
As device structures continue to shrink and new materials are introduced, Three Dimensional (3D) Metrology becomes
more important. The creation of 3D Metrology data is defined as the generation of statistically relevant 3D information
used in R&D and/or semiconductor manufacturing. Parameters of interest are: profile shape, side wall angle, material
properties, and height of the structure, as well as the variation within a die, across the wafer, and between wafers. In this paper we
will show how this information is used to calibrate process control systems in a semiconductor fab. Also, results will be
shown on how this information may be used to compare different types of Metrology equipment e.g. CD-SEM and
optical CD metrology techniques like scatterometry.
Certain applications, such as the generation of profile information for 55 nm dense contact holes in photoresist, require
new technology to minimize damage to the soft photoresist. A new technique called in-situ broadband argon cleaning
will be presented. Finally, the application of the argon column for protective coating deposition of sub-45 nm photoresist
lines will be discussed.
Track optimization and control for 32nm node double patterning and beyond
Show abstract
Given the increasingly stringent CD requirements for double patterning at the 32nm node and beyond, the question arises
as to how best to correct for CD non-uniformity at litho and etch. For example, is it best to apply a dose correction over
the wafer while keeping the PEB plate as uniform as possible, or should the dose be kept constant and PEB CD tuning
used to correct? In this work we present experimental data, obtained on a state-of-the-art ASML XT:1900Gi and Sokudo
RF3S cluster, on both of these approaches, as well as on a combined approach utilizing both PEB CD tuning and dose
correction.
Contact area as the intuitive definition of contact CD based on aerial image analysis
Show abstract
As feature sizes continue to diminish, optical lithography is driven into the extreme
low-k1 regime, where the high MEEF increasingly complicates the relationship
between the mask pattern and the aerial image. This is true in particular for two-dimensional
mask patterns, which are by nature much more complicated than patterns
possessing one-dimensional symmetry. Thus, the intricacy of 2D image formation
typically requires a much broader arsenal of resolution enhancement techniques over
complex phase shift masks, including SRAFs and OPC, as well as exotic off-axis
illumination geometries. This complexity on the mask side makes the printability
effect of a random defect on a 2D pattern a field of rich and delicate phenomenology.
This complexity is reflected in the dispute over the CD definition of 2D patterns:
some sources use the X and Y values, while others use the contact area. Here, we
argue that for compact features, for which the largest dimension is not wider than the
PSF of the stepper optics, the area definition is the natural one. We study the response
of the aerial image to small perturbations in mask pattern. We show that any
perturbation creates an effect extending in all directions, thus affecting the area and
not the size in a single direction. We also show that, irrespective of the source of
perturbation, the aerial signal is proportional to the variation in the area of the printed
feature. The consequence of this effect is that the aerial inspection signal scales linearly
with the variation of printed area of the tested feature.
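A minimal sketch of the area-based CD definition argued for above: threshold an intensity map and count the printed pixels. The Gaussian spot below merely stands in for a real aerial image of a compact contact; the threshold, pixel size, and spot width are invented for illustration:

```python
import numpy as np

def contact_area(image: np.ndarray, threshold: float, pixel_nm: float) -> float:
    """Printed contact area (nm^2): count pixels where intensity exceeds threshold."""
    return float(np.count_nonzero(image > threshold)) * pixel_nm**2

# Synthetic "aerial image": radially symmetric intensity peak (arbitrary units).
n, pixel_nm = 201, 2.0
coords = np.linspace(-200.0, 200.0, n)          # 2 nm grid over a 400 nm window
xx, yy = np.meshgrid(coords, coords)
image = np.exp(-(xx**2 + yy**2) / (2 * 60.0**2))

area = contact_area(image, threshold=0.5, pixel_nm=pixel_nm)
print(f"contact area ~ {area:.0f} nm^2")
```

For the Gaussian used here, the 0.5-intensity contour has radius 60·√(2 ln 2) ≈ 70.6 nm, so the counted area lands near π·r² ≈ 1.57 × 10⁴ nm², and any perturbation of the intensity map changes this area rather than a single X or Y cut.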
Focus and dose control for high-volume manufacturing of semiconductor
Show abstract
We have proposed a new in-line inspection method for focus and dose control in high-volume semiconductor
manufacturing, which we call the "Focus and Dose Line Navigator" (FDLN). This method can raise the performance of
semiconductor exposure tools and therefore increase the yield of semiconductor devices. The method steers the exposure
condition (focus and dose) toward the center of the process window. FDLN calculates the correct exposure condition by
solving the inverse problem. The sequence involves the following steps. 1) Create a focus exposure matrix (FEM) on a
test wafer to build models as supervised data; the models are relational equations between multiple measurement results
of the resist patterns (e.g., critical dimension (CD), height, and sidewall angle) and the exposure conditions of the
FEM. 2) Measure the resist patterns on production wafers and feed the measurement data into the library to predict focus
and dose. In this work, we evaluated the accuracy of FDLN. Sample wafers were made with a Canon "FPA-7000AS7" exposure
tool, and a Veeco "InSight" advanced CD-AFM was used as the topography measurement tool.
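The FEM-as-supervised-data idea in step 1 can be sketched as a simple regression. The response surfaces for CD, height, and sidewall angle below are invented stand-ins for measured FEM data, and the linear inverse model is a deliberate simplification of whatever library fitting FDLN actually uses:

```python
import numpy as np

# Invented FEM "training" data: exposure conditions swept over a window.
rng = np.random.default_rng(1)
focus = rng.uniform(-0.10, 0.10, 200)   # um
dose = rng.uniform(28.0, 32.0, 200)     # mJ/cm^2

# Invented resist responses (real ones come from FEM measurements):
cd = 45.0 - 1.2 * (dose - 30.0) - 80.0 * focus**2   # nm
height = 100.0 - 50.0 * np.abs(focus)               # nm
swa = 88.0 - 30.0 * focus - 0.4 * (dose - 30.0)     # degrees

# Fit a linear inverse model (focus, dose) ~ measurements via least squares.
X = np.column_stack([np.ones_like(cd), cd, height, swa])
coef_f, *_ = np.linalg.lstsq(X, focus, rcond=None)
coef_d, *_ = np.linalg.lstsq(X, dose, rcond=None)

# In production, the same model maps new (CD, height, SWA) triples to focus/dose.
pred_focus, pred_dose = X @ coef_f, X @ coef_d
print("focus RMS error (um):", np.sqrt(np.mean((pred_focus - focus) ** 2)))
print("dose RMS error (mJ/cm^2):", np.sqrt(np.mean((pred_dose - dose) ** 2)))
```

Multiple resist-shape measurements are needed because a single metric such as CD cannot separate a focus error from a dose error; the extra features break that degeneracy.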
Efficient use of design-based binning methodology in a DRAM fab
Show abstract
It is a well established fact that as design rules and printed features shrink, sophisticated
techniques are required to ensure the design intent is indeed printed on the wafer. Techniques of this
kind are Optical Proximity Correction (OPC), Resolution Enhancement Techniques (RET), and
Design for Manufacturing (DFM). As these methods are applied to the overall chip and rely on
complex modeling and simulations, they increase the risk of creating local areas or layouts with a
limiting process window. Hence, it is necessary to verify the manufacturability (sufficient depth of
focus) of the overall die and not only of a pre-defined set of metrology structures. The verification
process is commonly based on full chip defect density inspection of a Focus Exposure Matrix (FEM)
wafer, combined with appropriate post-processing of the inspection data. This is necessary to avoid a
time-consuming search for the Defects of Interest (DOIs), as defect counts are usually too high to be
handled by manual SEM review. One way to post-process defect density data is so-called design-based
binning (DBB). The Litho Qualification Monitor (LQM) system allows defects to be classified and also
binned based on design information. In this paper we will present an efficient way to combine
classification and binning in order to check design rules and to determine the marginal features
(layout with low depth of focus).
Design-based binning has been connected to the Yield Management System (YMS) to allow
new process monitoring approaches toward design-based SPC. This could dramatically cut the
time needed to detect systematic defects inline.
Proximity matching for ArF and KrF scanners
Show abstract
IC manufacturers all over the world use various exposure systems and work to very high
requirements in order to establish and maintain stable lithographic processes at 65 nm, 45 nm,
and below. Once a process is established, the manufacturer wants to be able to run it on
whichever tools are available. This is why proximity matching plays a key role in maximizing
tool utilization, and thus productivity, across different types of exposure tools.
In this paper, we investigate the source of errors that cause optical proximity mismatch
and evaluate several approaches for proximity matching of different types of 193 nm
and 248 nm scanner systems such as set-get sigma calibration, contrast adjustment, and,
finally, tuning imaging parameters by optimization with Manual Scanner Matcher.
First, to monitor the proximity mismatch, we collect CD measurement data for the
reference tool and for the tool-to-be-matched. Normally, the measurement is performed
for a set of line or space through pitch structures.
Secondly, by simulation or experiment, we determine the sensitivity of the critical
structures with respect to small adjustments of exposure settings such as NA, sigma
inner, sigma outer, dose, focus scan range, etc., the so-called 'proximity tuning knobs'.
Then, with the help of special optimization software, we compute the proximity knob
adjustment that has to be applied to the tool-to-be-matched to match the reference tool.
Finally, we verify successful matching by exposing on the tool-to-be-matched with
tuned exposure settings.
This procedure is applicable to inter- and intra-scanner-type matching, but possibly
also to process transfers to the design targets.
To illustrate the approach we show experimental data as well as results of
imaging simulations. The results demonstrate successful matching of critical structures for
ArF scanners of different tool generations.
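The knob-computation step can be sketched as a small least-squares problem. The sensitivity matrix, the knob set, and the mismatch values below are invented for illustration; real sensitivities come from the simulations or experiments described above:

```python
import numpy as np

# Sensitivity matrix S: CD change (nm) per unit change of each "proximity
# tuning knob", one row per monitored structure (all values invented).
S = np.array([
    # d(CD)/d(NA)  d(CD)/d(sigma_outer)  d(CD)/d(dose)
    [ 12.0,  -8.0,  1.5],   # dense line through pitch
    [  6.0,  -3.0,  1.4],   # semi-dense line
    [  1.0,   0.5,  1.3],   # isolated line
])

# Measured CD mismatch (nm) between reference tool and tool-to-be-matched.
delta_cd = np.array([1.8, 0.9, 0.4])

# Solve for the knob offsets that best cancel the mismatch (least squares).
knobs, *_ = np.linalg.lstsq(S, delta_cd, rcond=None)
print("knob adjustments (NA, sigma_outer, dose):", np.round(knobs, 4))
print("predicted residual mismatch (nm):", np.round(S @ knobs - delta_cd, 4))
```

With more monitored structures than knobs the system is overdetermined, and the least-squares solution trades off residual mismatch across structures rather than zeroing any one of them.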
Scanner matching optimization
Show abstract
Cost of ownership of scanners for the manufacturing of front end layers is becoming increasingly expensive. The ability
to quickly switch the production of a layer to another scanner in case it is down is important. This paper presents a
method to match the scanner grids optimally, so that front-end scanners in effect become
interchangeable. A breakdown of the various components of overlay is given and we discuss methods to optimize the
matching strategy in the fab. A concern here is how to separate the scanner and process induced effects. We look at the
relative contributions of intrafield and interfield errors caused by the scanner and the process. Experimental results of a
method to control the scanner grid are presented and discussed. We compare the overlay results before and after
optimizing the scanner grids and show that the matching penalty is reduced by 20%. We conclude with some thoughts
on the need to correct the remaining matching errors.
Comparative study of process window identification methods for 45 nm device and beyond
Show abstract
Lithography process control becomes increasingly challenging as the design rules shrink. To tackle the issue of
identifying the process window for lithography, we systematically compared three different approaches for a 45 nm
process wafer along two variables: inspection mode (FEM or PWQ) and analysis methodology (Manual or Design Based
Binning). We found that PWQ + DBB provided the best results.
Improve scanner matching using automated real-time feedback control via scanner match maker (SMM)
Shian-Huan (Cooper) Chiu,
Sheng-Hsiung Yu,
Min-Hin Tung,
et al.
Show abstract
Traditional "matching matrix" methods for characterizing scanner matching assume that the scanner distortion
performance is static. The latest scanner models can adjust the distortion performance dynamically, at run-time. The
Scanner Match Maker (SMM) system facilitates calculation and application of these run-time adjustments, improving
effective overlay performance of the scanner fleet, allowing more flexibility for mix-and-match exposure. The overlay
|mean|+3σ performance was improved significantly for a layer pair that is currently allowed mix-and-match pairing.
Hole inspection technology using Fourier imaging method
Show abstract
There are two kinds of critical dimension (CD) management tools: CD-SEM and optical CD (OCD). OCD is
preferable to other existing measurement tools because of its higher throughput and lower photoresist damage. We have
developed an Automated Pattern profile Management (APM) system based on the OCD concept. For monitoring
thin lines, APM detects the light intensity from an optical system consisting of a polarizer and an analyzer set in a cross-
Nicol configuration as a polarization fluctuation. This paper reports our development of monitoring technology for holes.
For hole management, APM detects the light intensity as a diffraction-intensity fluctuation. First, the best
conditions for hole management were designed from simulations; these were an off-axis aperture and S
polarization. In our evaluation of wafers without an underlayer, we obtained a good correlation with CD-SEM values. From the
simulation, we consider the APM system to be very effective for next-generation shrinking-hole process management.
Investigation of factors causing difference between simulation and real SEM image
Show abstract
As candidate factors for accurate Monte Carlo simulation of SEM images, we propose (1) the difference
in cross-section between an approximate shape used for simple simulation and the real pattern shape, (2) the influence of native
oxide growing on the pattern surface, and (3) the potential distribution above the target surface. The
influence of each on the SEM signal is studied by means of experiments and simulations, using a Si trench pattern as a motif. Among
these factors, native oxide of about 1nm in thickness has a significant influence, increasing SEM signals at the top
edge and the slope. We have assumed and discussed models for the native oxide effect.
Development and implementation of PWQ on patterned wafer darkfield inspection systems
Uwe Streller,
Kay Wendt,
Arno Wehner,
et al.
Show abstract
Process Window Qualification (PWQ) is a well-established wafer inspection technique used to qualify the design of
mask sets and to characterize lithography process windows. While PWQ typically employs a broadband brightfield
inspector, novel techniques for patterned wafer darkfield inspection have proven to provide sufficient sensitivity along
with noise suppression benefits for lithography layers. This paper describes the introduction and implementation of
PWQ on patterned wafer darkfield inspectors. An initial project characterized critical PWQ requirements on the
darkfield inspector. The results showed that this new approach meets performance requirements, such as defect of
interest (DOI) detection and process window characterization, as well as ease-of-use requirements such as automated
setup for advanced design rule products.
Development of a novel methodology for effective partial die inspection and monitoring
Show abstract
Defect inspection is a challenge in the wafer-edge region, and several new inspection tools and techniques have come
to the market to fulfill this inspection need. Current inspection methodology excludes inspection of the partial die located at
the wafer edge, which has led to the development of a technique for patterned wafer inspection tools to inspect
these partially printed die. In this paper we identify and develop a robust methodology for the characterization and
monitoring of defectivity on the partially printed edge die. The methodology includes the development of methods for
inspection optimisation requirements, characterization and isolation of defect sources, optimisation of clustering and
binning and control of partial die defectivity.
Evaluation of a new photoresist dispense system to detect coating variation
Show abstract
Even a minimal change in dispensed volume can have a severe impact on film thickness uniformity, and in the worst case
there may be a lack of resist on the wafer. It is therefore essential to set up the photoresist dispense accurately to
avoid any dispense variation. In addition, it is important to monitor the dispense conditions real-time to detect problems
which may have a direct negative impact on process yield.
This paper presents the evaluation of the IntelliGen® Mini dispense system which is manufactured by Entegris, Inc. This
new system is able to detect variations like bubbles in the dispense line, changes to the stop suckback valve, and changes
in viscosity [1]. After an explanation of the pump characteristics and the potential root causes of dispense variation and
their consequences, the evaluation done in Altis Semiconductor will be presented.
The study was made using different photochemicals, including low- and mid-range-viscosity photoresists and
anti-reflective coatings. The capability of this new product to detect any perturbation of the coating will be demonstrated.
Then standard tests such as coating repeatability, defect density, CD uniformity, and finally wafer yield inspection will be
performed to prove the efficiency of the system in production mode.
A study on effect of point-of-use filters on defect reduction for advanced 193nm processes
Show abstract
Bottom Anti-Reflective Coatings (BARCs) have been widely used in the lithography process for decades. BARCs play
important roles in controlling reflections and therefore improving swing ratios, CD variations, reflective notching, and
standing waves. The implementation of BARC processes in 193nm dry and immersion lithography has been
accompanied by defect reduction challenges on fine patterns. Point-of-Use filters are well known among the most
critical components on a track tool ensuring low wafer defects by providing particle-free coatings on wafers. The filters
must have very good particle retention to remove defect-causing particulate and gels while not altering the delicate
chemical formulation of photochemical materials.
This paper describes a comparative study of the efficiency and performance of various Point-of-Use filters in reducing
defects observed in BARC materials. Multiple filter types with a variety of pore sizes, membrane materials, and filter
designs were installed on an Entegris IntelliGen® Mini dispense pump integrated in the coating module of a
clean track. An AZ® 193nm organic BARC material was spin-coated on wafers through the various filter media.
Lithographic performance of filtered BARCs was examined and wafer defect analysis was performed. By this study, the
effect of filter properties on BARC process related defects can be learned and optimum filter media and design can be
selected for BARC material to yield the lowest defects on a coated wafer.
Automated defect review of the wafer bevel with a defect review scanning electron microscope
Show abstract
One of the few remaining bastions of non-regulated Integrated Circuit defectivity is the wafer bevel. Recent internal
Integrated Circuit Manufacturing studies have suggested that the edge bevel may be responsible for as much as a two to
three percent yield loss during a defect excursion on the manufacturing line and a one to two percent yield loss during
ongoing wafer manufacturing.
A new generation of defect inspection equipment has been introduced to the Research and Development, Integrated Circuit, MEMS, and Si wafer manufacturing markets that gives the end user the ability to detect defects located on the bevel of the wafer.
The inherent weakness of the current batch of wafer bevel inspection equipment is its inability to automatically classify discrete defects into multiple, meaningful classification bins, together with its lack of discrete elemental analysis data. Root cause analysis is therefore based on minimal discrete defect analysis as a surrogate for a statistically valid sampling of defects from the bevel.
This paper provides a study of the methods employed with a Hitachi RS-5500EQ Defect Review Scanning Electron Microscope (DRSEM) to automatically capture high-resolution/high-magnification images and collect elemental analysis on a statistically valid sample of the discrete defects located by a bevel inspection system.
Results from prototype die-to-database reticle inspection system
Show abstract
A prototype die-to-database high-resolution reticle defect inspection system has been developed for 32nm and below
logic reticles, and 4X Half Pitch (HP) production and 3X HP development memory reticles. These nodes will use
predominantly 193nm immersion lithography (with some layers double patterned), although EUV may also be used.
Many different reticle types may be used for these generations including: binary (COG, EAPSM), simple tritone,
complex tritone, high transmission, dark field alternating (APSM), mask enhancer, CPL, and EUV. Finally, aggressive
model based OPC is typically used, which includes many small structures such as jogs, serifs, and SRAF (sub-resolution
assist features), accompanied by very small gaps between adjacent structures. The architecture and performance of the prototype inspection system are described. This system is designed to inspect the aforementioned reticle types in die-to-database mode. Die-to-database inspection results are shown on standard programmed defect test reticles, as well as on advanced 32nm logic and 4X HP and 3X HP memory reticles from industry sources. Direct comparisons with current-generation inspection systems show measurable sensitivity improvement and a reduction in false detections.
Automated reticle inspection data analysis for wafer fabs
Show abstract
To minimize potential wafer yield loss due to mask defects, most wafer fabs implement some form of reticle inspection
system to monitor photomask quality in high-volume wafer manufacturing environments. Traditionally, experienced
operators review reticle defects found by an inspection tool and then manually classify each defect as 'pass, warn, or
fail' based on its size and location. However, in the event reticle defects are suspected of causing repeating wafer
defects on a completed wafer, potential defects on all associated reticles must be manually searched on a layer-by-layer
basis in an effort to identify the reticle responsible for the wafer yield loss. This 'problem reticle' search process is a
very tedious and time-consuming task and may cause extended manufacturing line-down situations.
Oftentimes, Process Engineers and other team members need to manually investigate several reticle inspection reports to determine if yield loss can be tied to a specific layer. Because of the very nature of this detailed work, calculation errors may occur, resulting in an incorrect root cause analysis. These delays waste valuable resources that could be spent on other, more productive activities.
This paper examines an automated software solution for converting KLA-Tencor reticle inspection defect maps into a format compatible with KLA-Tencor's Klarity Defect™ data analysis database. The objective is to use the graphical
charting capabilities of Klarity Defect to reveal a clearer understanding of defect trends for individual reticle layers or
entire mask sets. Automated analysis features include reticle defect count trend analysis and potentially stacking reticle
defect maps for signature analysis against wafer inspection defect data. Other possible benefits include optimizing
reticle inspection sample plans in an effort to support "lean manufacturing" initiatives for wafer fabs.
Inspection and metrology tools benefit from free-form refractive micro-lens and micro-lens arrays
Show abstract
LIMO's unique production technology based on computer-aided design enables the manufacture of high-precision aspheric single lenses and arrays, in which every single lens can be individually shaped. These free-form micro-optical cylindrical lenses and lens arrays find their application in various types of metrology systems. Owing to the highly precise manufacturing of specially designed surfaces, single lenses can be bonded directly onto sensors or sensor arrays, performing efficient projection of the signal onto the detector. Optical modules based on micro-lens arrays enable special intensity distributions, as well as highly homogeneous illumination with an inhomogeneity of less than 1% (peak to valley), used in the illumination parts of inspection tools. Due to their special free-form profile, a special class of asymmetric lens arrays can offer extremely uniform illumination of targets that are not orthogonal to the illumination path. The feature under inspection can be uniformly illuminated even if it lies at a specific angle to the illumination. This allows better conditions for measurement devices arranged orthogonal to the mask or wafer. Furthermore, the use of micro-optics enables more efficient inspection of laser beam parameters for excimer or CO2 lasers. Additionally, very accurate metal patterns can be applied on the optics and used as alignment marks, apertures, or bonding features.
A scatterometry based CD metrology solution for advanced nodes, including capability of handling birefringent layers with uniaxial anisotropy
Show abstract
Developing a CD metrology technique that can address the need for accuracy, precision, and speed in near-future lithography is probably one of the most challenging tasks. CD-SEMs have served this need for a long time; however, a change of, or an addition to, this traditional approach is inevitable as the need for better precision (tight CDU budgets) and speed (driven by the demand for increased sampling) continues to grow for advanced nodes.
The success of CD measurement with scatterometry rests on the capability to model the resist grating, i.e., its CD and shape (side-wall angle), as well as the underlying layers (thickness and material properties). Things are relatively easy for cases with isotropic underlying layers (which have a single refractive index and absorption coefficient). However, a real challenge to such a technique becomes evident when one or more of the underlying layers are anisotropic.
In this technical presentation the authors would like to evaluate such CD reconstruction technology, a new
scatterometry based platform under development at ASML, which can handle bi-refringent non-patterned layers
with uniaxial anisotropy in the underlying stack. In the RCWA code for the bi-refringent case, the elegant
formalism of the enhanced transmittance matrix can still be used. In this paper, measurement methods and data
will be discussed from several complex production stacks (layers). With inclusion of the bi-refringent modeling,
the in-plane and perpendicular n and k values can be treated as floating parameters for the bi-refringent layer, so
that very robust CD-reconstruction is achieved with low reconstruction residuals. As a function of position over
the wafer, significant variations of the perpendicular n and k values are observed, with a typical radial fingerprint
on the wafer, whereas the variations in the in-plane n and k values are seen to be considerably lower.
An inverse ellipsometric problem for thin film characterization: comparison of different optimization methods
Show abstract
In this paper, an ill-posed inverse ellipsometric problem for thin film characterization is studied. The aim is to determine the thickness, the refractive index, and the extinction coefficient of homogeneous films deposited on a substrate without assuming any a priori knowledge of the dispersion law. Different methods are implemented for the benchmark. The first method treats the spectroscopic ellipsometer as a collection of single-wavelength ellipsometers coupled only via the film thickness. The second is an improvement of the first and uses Tikhonov regularization in order to smooth the parameter curve; a cross-validation technique is used to determine the best regularization coefficient. The third method consists of a library search: the aim is to choose the best combination of parameters inside a pre-computed library. In order to be more accurate, we also used multi-angle and multi-thickness measurements combined with the Tikhonov regularization method; this complementary approach is also part of the benchmark. The same polymer resist material is used as the thin film under test, with two different thicknesses and three angles of measurement. The paper discloses the results obtained with these different methods and provides elements for the choice of the most efficient strategy.
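The pairing of Tikhonov regularization with cross-validation described in this abstract can be sketched on a synthetic curve. The quadratic smoothing problem and the leave-one-out scheme below are a generic stand-in, not the authors' exact formulation, and all numerical parameters are invented for illustration:

```python
import numpy as np

def tikhonov_smooth(y, lam):
    """Solve min ||x - y||^2 + lam * ||D2 x||^2 (D2 = second differences).

    A generic Tikhonov-regularized smoother for a sampled parameter curve.
    """
    n = len(y)
    D2 = np.diff(np.eye(n), n=2, axis=0)          # (n-2) x n curvature operator
    return np.linalg.solve(np.eye(n) + lam * D2.T @ D2, y)

def loo_cv_score(y, lam):
    """Leave-one-out cross-validation error for a given regularization weight."""
    n = len(y)
    err = 0.0
    for i in range(n):
        xs = np.delete(np.arange(n), i)
        smooth = tikhonov_smooth(np.delete(y, i), lam)
        # predict the held-out point by interpolating the smoothed curve
        err += (np.interp(i, xs, smooth) - y[i]) ** 2
    return err / n

# Synthetic "refractive index vs wavelength" curve with measurement noise
rng = np.random.default_rng(0)
wl = np.linspace(400, 800, 60)
n_true = 1.5 + 0.05 * np.exp(-(wl - 500.0) ** 2 / 2e4)
n_meas = n_true + rng.normal(0, 0.01, wl.size)

lams = [0.1, 1.0, 10.0, 100.0]
best = min(lams, key=lambda l: loo_cv_score(n_meas, l))   # CV picks lambda
n_smooth = tikhonov_smooth(n_meas, best)
print("chosen lambda:", best)
```

Cross-validation trades bias against variance automatically: too small a weight leaves the noise in, too large a weight flattens the real dispersion curve, and the leave-one-out error penalizes both.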
Analysis of Kohler illumination for 193 nm scatterfield microscope
Show abstract
A scatterfield microscope using 193 nm laser light was developed that utilizes angle-resolved illumination for high
resolution optical metrology. An angle scan module was implemented that scans the illumination beam in angle space at
the sample by linearly scanning a fiber aperture at a conjugate back focal plane. The illumination light is delivered
directly from a source laser via an optical fiber in order to achieve homogeneous angular illumination. A unique design
element is that the conjugate back focal plane (CBFP) is telecentric allowing the optical axis of the fiber to be scanned
linearly. Initial results from full field and angle-resolved illumination are presented and potential applications in
semiconductor metrology are described.
A new illumination technique for grating-based nanometer measurement applications
Show abstract
Optical gratings are becoming available with precision down to the 2 nm level, or below. Such gratings can be employed
to make highly accurate measuring tools such as optical encoders and coordinate measuring tools, which can find numerous applications in the integrated circuit industry and elsewhere. Such tools are significantly less expensive than
interferometers because of relaxed mechanical tolerances required on the associated stages. However, the accuracy of
grating-based measurement tools is limited by optical imaging techniques for viewing the gratings. In this paper, a novel
illumination technique involving two parallel coherent beams is developed for grating-based measurement tools with
nanometer accuracy. The interference grating image was viewed under a microscope and the video image of the grating
was taken and processed. The location of the grating was determined with an error of 0.09 nm. The interference artifacts
generally present in laser illumination are eliminated under this new illumination technique. This provides a cleaner
starting point for data analysis and thus higher-accuracy measurement. The techniques developed here provide levels of accuracy limited only by the errors in the grating itself. This is comparable to the accuracy of state-of-the-art interferometers, but at much lower cost.
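The sub-pixel localization underlying such grating-based measurement can be illustrated with a lock-in style phase estimate on a synthetic fringe image. The forward model, noise level, and parameters here are invented for illustration and are not taken from the paper:

```python
import numpy as np

def grating_position(signal, period_px):
    """Estimate the lateral shift of a sinusoidal grating image in pixels.

    Projects the image line onto the known grating frequency (lock-in
    detection); the phase of the complex coefficient gives the shift as a
    fraction of the period, enabling sub-pixel localization. Hypothetical
    analysis, not the authors' algorithm.
    """
    x = np.arange(len(signal))
    c = np.sum(signal * np.exp(-2j * np.pi * x / period_px))
    return -np.angle(c) * period_px / (2 * np.pi)

# Synthetic fringe line: period 20 px, shifted by 3.17 px, mild noise
rng = np.random.default_rng(1)
period, shift = 20.0, 3.17
x = np.arange(400)
line = 1.0 + 0.5 * np.cos(2 * np.pi * (x - shift) / period)
line += rng.normal(0, 0.02, x.size)

est = grating_position(line, period)
print(f"true shift {shift:.2f} px, estimated {est:.3f} px")
```

Because the estimate averages over every fringe in the field of view, the phase noise, and hence the position error, shrinks with the square root of the number of pixels, which is how errors far below a pixel become attainable.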
SCATT: software to model scatterometry using the rigorous electromagnetic theory
Show abstract
Measurement of the critical dimensions and vertical shape of features in semiconductor manufacturing is a critical task. Optical scatterometry has proved capable of providing such measurements. In this paper, a software tool to model scatterometry was developed. Rigorous coupled-wave analysis (RCWA) is used as the physical model. This software does not use fitting coefficients of any specific equipment and is therefore useful for understanding, analyzing, and optimizing measurements of specific patterns and the potential sensitivity of methods and systems to process variation. Special attention was given to improving the accuracy and throughput of the simulation; comparisons with other software demonstrated the advantages of the developed tool.
Improved diffraction computation with a hybrid C-RCWA-method
Show abstract
The Rigorous Coupled Wave Approach (RCWA) is acknowledged as a well-established diffraction simulation method in electromagnetic computing. Its two most essential applications in the semiconductor industry are in optical scatterometry and optical lithography simulation. In scatterometry, it is the standard technique for simulating spectra or diffraction responses for the gratings to be characterized. In optical lithography simulation, it is an effective alternative to supplement or even replace FDTD for the calculation of light diffraction from thick masks as well as from wafer topographies. Unfortunately, the RCWA shows some serious disadvantages, particularly for the modelling of grating profiles with shallow slopes and of multilayer stacks with many layers, such as extreme-UV masks with a large number of quarter-wave layers. Here, the slicing may become a nightmare, and the computation costs may increase dramatically. Moreover, the accuracy suffers due to the inadequate staircase approximation of the slicing in conjunction with the boundary conditions in TM polarization. On the other hand, the Chandezon Method (C-Method) solves all these problems in a very elegant way; however, it fails for binary patterns or gratings with very steep profiles, where the RCWA works excellently. Therefore, we suggest a combination of both methods as plug-ins within the same scattering-matrix coupling frame. The improved performance and the advantages of this hybrid C-RCWA method over the individual methods are shown with some relevant examples.
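The scattering-matrix coupling frame that both solvers plug into can be sketched with the standard Redheffer star product, which cascades per-layer S-matrices regardless of which method produced each one. The block formulas below are the textbook ones; the dict-based layout and the two-mode example are purely illustrative:

```python
import numpy as np

def star(SA, SB):
    """Redheffer star product of two scattering matrices (SA above SB).

    Each argument is a dict of blocks S11, S12, S21, S22. The inverse
    factors account for the infinite series of multiple reflections
    between the two sub-stacks.
    """
    I = np.eye(SA["S22"].shape[0])
    FA = np.linalg.inv(I - SB["S11"] @ SA["S22"])
    FB = np.linalg.inv(I - SA["S22"] @ SB["S11"])
    return {
        "S11": SA["S11"] + SA["S12"] @ FA @ SB["S11"] @ SA["S21"],
        "S12": SA["S12"] @ FA @ SB["S12"],
        "S21": SB["S21"] @ FB @ SA["S21"],
        "S22": SB["S22"] + SB["S21"] @ FB @ SA["S22"] @ SB["S12"],
    }

# Sanity check: coupling with a perfectly transparent layer (zero
# reflection, identity transmission) must leave an S-matrix unchanged
n = 2
rng = np.random.default_rng(2)
SA = {k: rng.normal(size=(n, n)) * 0.3 for k in ("S11", "S12", "S21", "S22")}
identity_layer = {"S11": np.zeros((n, n)), "S12": np.eye(n),
                  "S21": np.eye(n), "S22": np.zeros((n, n))}
S = star(SA, identity_layer)
print(np.allclose(S["S12"], SA["S12"]))
```

Because the product only consumes the four blocks of each factor, a layer solved by RCWA and a layer solved by the C-method can be cascaded identically, which is the "plug-in" property the hybrid scheme relies on.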
Multi-purpose optical profiler for characterization of materials, film stacks, and for absolute topography measurement
Show abstract
We have developed a scanning white-light interference microscope that offers two complementary modes of operation
on a common metrology platform. The first mode measures the topography and the second mode measures the complex
reflectivity of an object surface over a range of wavelengths, angles of incidence and polarization states. This second
mode characterizes material optical properties and determines film thickness in multi-layer film stacks with an effective
measurement spot size typically smaller than 10 μm. These data compensate for material and film effects in the surface
topography data collected in the first mode. We illustrate the application of this dual-mode technology for post-CMP
production-line metrology for the data storage industry. Our tool concurrently measures critical layer thickness and step
height for this application. The accuracy of the latter measurement is confirmed by correlation to AFM measurements.
Immersion scanner proximity matching using angle resolving scatterometry metrology
Show abstract
The fingerprint of optical proximity effect, OPE, is required to develop each process node's optical proximity correction
(OPC) model. The OPC model should work equally well on exposure systems of the type on which the model was
developed and of different type. Small differences in optical and mechanical scanner properties can lead to a different
CD characteristic for a given OPC model. It becomes beneficial to match the OPE of one scanner to the scanner
population in a fab. Here, we report on a matching technique based on measured features in resist employing either CD-SEM or scatterometry. We show that angle-resolved scatterometry improves the metrology throughput and repeatability. The sensitivity of the CD as a function of the scanner adjustments and the effect of scanner tuning can be described more precisely by scatterometry using an identical number of printed features for measurement. In our example, the RMS deviation between the measured and the predicted tuning effect is 0.2 nm for scatterometry compared to 0.8 nm for CD-SEM, allowing tighter matching targets to be set.
Integrated ODP metrology with floating n&k's for lithography process
Show abstract
Advanced DRAM manufacturing demands rigorous and tight process control using metrology solutions that offer high measurement precision, accuracy, traceability, and throughput. Scatterometry is one of the advanced metrology techniques that satisfies all of the above-mentioned requirements, and it has been implemented in semiconductor manufacturing for some time for monitoring and controlling critical dimensions and other important structural parameters. One of the major contributions to optical critical dimension metrology uncertainty is the variation in the optical properties (n&k's) of film stack materials. It is well known that the optical properties of materials depend very much on process conditions (such as the operating conditions of deposition tools). However, in the traditional scatterometry approach, all the n&k's have been used as fixed inputs in the scatterometry model, which might result in significant metrology error.
This paper shows the use of an integrated scatterometry system in a real production environment. A significant improvement in the accuracy of CD data was achieved following the implementation of the new floating-n&k's option for the Optical Digital Profilometry (ODP™) system. It has been clearly shown that to achieve the desired sub-nanometer accuracy in scatterometry measurements for advanced processes, we need to pay scrupulous attention to every detail of the scatterometry modeling and measurement. Further work is still needed to better understand the impact of n&k variations on tool-to-tool matching.
Restoring pattern CD and cross-section using scatterometry: various approaches
Show abstract
Scatterometry is one of the major methods used to measure the linewidths and profiles of fabricated patterns. In scatterometry, light reflected from or transmitted through a periodic pattern is measured. The spectral signature of the pattern is compared to a library of signatures obtained using simulations, and the cross-section profile and linewidth are determined. The comparison method should be accurate and fast. On the other hand, the method should be able to work with relatively small libraries to shorten the creation of the library for specific types of patterns. This paper describes the evaluation of various approaches to such solutions. A direct comparison, a neural network, and a polynomial method were used. It was found that all these methods can provide good accuracy, while the neural network and polynomial methods require considerably smaller libraries for stable and accurate extraction of CDs and the profile.
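The trade-off between a direct library comparison and an interpolating refinement can be illustrated with a toy forward model. Everything below, including the signature function and the parabolic (polynomial-style) refinement, is invented for illustration and is not the authors' implementation:

```python
import numpy as np

def signature(cd, wl):
    """Toy spectral signature of a grating, smooth in CD.

    A stand-in for the rigorously simulated signatures of a real library.
    """
    return np.cos(2 * np.pi * cd / wl) + 0.2 * np.sin(4 * np.pi * cd / wl)

wl = np.linspace(200.0, 800.0, 50)
lib_cds = np.arange(40.0, 82.0, 2.0)          # sparse library: 2 nm steps
lib = np.array([signature(c, wl) for c in lib_cds])

meas = signature(57.3, wl)                    # "measured" signature
sse = np.sum((lib - meas) ** 2, axis=1)       # error vs. each library entry

# Direct comparison: accuracy limited by the library step
i = int(np.argmin(sse))
cd_direct = lib_cds[i]

# Polynomial refinement: fit a parabola through the three points around
# the minimum of the error curve and take its vertex; sub-step accuracy
# means a coarser (smaller) library can still yield an accurate CD
step = lib_cds[1] - lib_cds[0]
e_l, e_m, e_r = sse[i - 1], sse[i], sse[i + 1]
cd_poly = lib_cds[i] + step * (e_l - e_r) / (2 * (e_l - 2 * e_m + e_r))

print(f"direct: {cd_direct:.1f} nm, refined: {cd_poly:.2f} nm (true: 57.3 nm)")
```

The same economy argument applies to a trained interpolator such as a neural network: once the mapping from signature to CD is learned, library points only need to be dense enough to pin down a smooth function, not dense enough to enumerate every answer.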
Process optimization for optical CD correlation improvement of ADI and AEI 3D structure by using iODP
Show abstract
To fully characterize the lithography process, it is critical to have an accurate CD and profile of the photoresist structure at the ADI stage. Traditionally, CD-SEM can provide only limited profile information and is extremely difficult to integrate for real-time, in-line, wafer-level process control because of throughput issues. Over the past few years, optical digital profilometry (ODP®), developed by Timbre Technologies, Inc., has been adopted for real-time process control in litho for optical CD and shape monitoring. In this paper, the integrated ODP® reflectometer is applied to study the process signatures of complicated 3D ADI and AEI structures of a 70nm DRAM process. The DT structures from the 70nm node process studied in this paper are elliptical photoresist vias developed over a thin film stack at ADI, and vias etched deeply through the thick (over 5um) dielectric film at AEI. At ADI, the ODP® library is qualified by careful cross-checking with CD-SEM data. CD results from iODP® show very good correlation to those from CD-SEM. The iODP® measurement for a FEM wafer shows smoother and cleaner Bossung curves than the CD-SEM does. At AEI, the library is then qualified for top-CD measurement in comparison to CD-SEM, and also against the results at ADI. By implementing iODP® measurements on both ADI and AEI structures, their signature patterns from ADI to AEI for the 70nm DT process can be matched successfully. Such a signature-pattern match indicates the strong correlation between the ADI and AEI processes and can be fully exploited for APC. It is a significant step toward IM APC control considering the ADI and AEI process steps together.
Using scatterometry to improve process control during the spacer pitch splitting process
Show abstract
In an effort to keep scaling at the speed of Moore's law, novel methods are being developed to facilitate advanced
semiconductor manufacturing at the 32nm node and beyond. One such method for enabling the creation of dense pitches
beyond the current lithography resolution limit is spacer pitch splitting. This method typically involves patterning a
sacrificial gate pattern, then performing a standard spacer deposition and etch back process, after which the sacrificial gate
is removed and the remaining spacers themselves are used as the effective mask for the pattern transfer. Some of the key
advantages of this process are the ability to create sub-resolution lines and also the improvement in Line Edge Roughness
seen on the final pattern. However, there are certain limitations with this process, namely the ability to only pattern lines in
one dimension, and also the complexity of the metrology, where the final Critical Dimension result is a function of the
lithography condition from the sacrificial gate patterning, and also the various film layer depositions as well as the spacer
etch back process. Given this complexity, the accurate measurement of not only the spacer width but also the spacer shape
is important. In this work we investigate the use of scatterometry techniques to enable these measurements on leading edge
devices.
Time dependence of SEM signal due to charging: measurements and simulation using Monte Carlo software
Show abstract
CD-SEM is the main tool for measuring critical dimensions (CDs). CD measurements involve systematic errors that depend on the SEM set-up and on the pattern. In addition to systematic errors, charging of the wafer plays an important role in CD-SEM and defect inspection tools. The charging dependence of the secondary electron emission coefficient, which is one of the major charging parameters, was studied. Timing characteristics were measured and then simulated using a Monte Carlo model. The measurements and simulations were done for multiple numbers of frames and for imaging of a contact hole using pre-charging of a large area. The results of the simulation confirmed the measured results. Understanding this effect helps in tuning the settings of a CD-SEM.
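The time dependence at issue, a secondary electron signal that decays frame by frame as positive charge builds up until charge balance is reached, can be mimicked with a minimal feedback model. This is an illustrative toy, not the authors' Monte Carlo, and all parameter values are invented:

```python
import numpy as np

def frame_signal(delta0=2.5, v0=5.0, k=1.2, frames=30):
    """Toy charging model: per-frame SE signal relaxing toward equilibrium.

    delta0 is the uncharged secondary electron yield. When the yield
    exceeds 1, the surface charges positive; the growing potential v
    recaptures secondaries, suppressing the effective yield until the
    charge balance condition (yield = 1) is approached. All units are
    arbitrary and the exponential suppression law is an assumption.
    """
    v = 0.0
    out = []
    for _ in range(frames):
        delta = delta0 * np.exp(-v / v0)     # yield suppressed by charging
        out.append(delta)
        v += k * (delta - 1.0)               # net charge deposited this frame
    return np.array(out)

sig = frame_signal()
print("first frame:", sig[0], " last frame:", round(sig[-1], 3))
```

Even this crude feedback loop reproduces the qualitative measurement: the first frames are bright, the signal decays monotonically, and it settles at the charge-balance level, which is why frame count and pre-charging of a large area change what a CD-SEM actually records.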
Arbitrary precision value overlay and alignment system by double positioning of mask and wafer and electronic datum and nano sensor (notice of removal)
Show abstract
This paper (SPIE Paper 727249) was removed from the SPIE Digital Library on 30 April 2009 upon learning that the two names associated with this publication record, Xiang-Wen Xiong and Wynn L. Bear, are actually the same individual and not two different authors. This is not sanctioned by SPIE. As stated in the SPIE Guidelines for Professional Conduct and Publishing Ethics, "SPIE considers it the professional responsibility of all authors to ensure that the authorship of submitted papers properly reflects the contributions and consent of all authors." A serious violation of these guidelines is evident in this case. It is SPIE policy to remove papers from the SPIE Digital Library where serious professional misconduct has occurred and to impose additional sanctions as appropriate.
Noise-free estimation of spatial line edge/width roughness parameters
Show abstract
At present, the most widely used technique for Line Edge and Width Roughness (LER/LWR) measurement is based on the analysis of top-down CD-SEM images. However, the noise present in these images strongly affects the obtained edge morphologies, leading to biased LER/LWR measurements. In the last few years, significant progress has been made towards the acquisition of noise-free LER/LWR metrics. The output of all proposed methods is the noise-free rms value Rq estimated using lines with sufficiently long lengths. Nevertheless, one of the recent advances in LER/LWR metrology is the realization that a single Rq value does not provide a complete description of LER, and a three-parameter model has been introduced including the Rq value at infinite line length, the correlation length ξ, and the roughness exponent α. The latter two parameters describe the spatial fluctuations of the edge morphology and can be calculated from the height-height correlation function G(r) or from the dependence of the rms value Rq on the line length, Rq(L). In this paper, a methodology for the noise-free estimation of G(r) and Rq(L) is proposed. Following Villarrubia et al. [Proc. SPIE 5752, 480 (2005)], we obtain a formula for the noise-free estimation of G(r) and assess its predictions by implementing and applying a modeling approach. Also, we extend the methodology of Yamaguchi et al. [Proc. SPIE 6152, 61522D (2006)] to a large range of line lengths and show that it can provide a reliable noise-free estimation of the whole Rq(L) curve.
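The noise-subtraction idea behind a noise-free G(r) rests on a simple observation: uncorrelated pixel noise adds a constant 2*sigma^2 to the height-height correlation function at every nonzero lag but contributes nothing at r = 0, so extrapolating G(r) back to r = 0 estimates the noise floor. A toy sketch on a synthetic edge follows; the AR(1) edge model and the linear extrapolation are illustrative assumptions, not the authors' exact formulas:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic rough edge: AR(1) process with rms Rq and correlation length xi
N, rq_true, xi, sigma_noise = 20000, 2.0, 20.0, 1.0
rho = np.exp(-1.0 / xi)
w = rng.normal(0, rq_true * np.sqrt(1 - rho ** 2), N)
edge = np.empty(N)
edge[0] = rng.normal(0, rq_true)
for i in range(1, N):
    edge[i] = rho * edge[i - 1] + w[i]
meas = edge + rng.normal(0, sigma_noise, N)   # SEM noise on the detected edge

def G(e, r):
    """Height-height correlation function G(r) = <(e(x+r) - e(x))^2>."""
    return np.mean((e[r:] - e[:-r]) ** 2)

# Extrapolate G(r) linearly from r = 1, 2 back to r = 0: the intercept is
# the noise floor 2*sigma^2, which is then subtracted from the variance
g1, g2 = G(meas, 1), G(meas, 2)
noise_floor = 2 * g1 - g2
sigma_est = np.sqrt(max(noise_floor, 0.0) / 2)
rq_noise_free = np.sqrt(np.var(meas) - sigma_est ** 2)
print(f"sigma_est = {sigma_est:.2f} (true 1.0), "
      f"Rq = {rq_noise_free:.2f} (true 2.0)")
```

The extrapolation works because the true G(r) of a correlated edge grows smoothly from zero, while the noise contribution is a step; the smoother the edge at short lags (large ξ), the smaller the bias of the linear extrapolation.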
Comparison of physical gate-CD with in-die at-speed non-contact measurements for bin-yield and process optimization
J. S. Vickers,
J. Galvier,
W. Doedel,
et al.
Show abstract
We report on a performance-based measurement (PBM) technique from a volume-production 65-nm multi-product wafer (MPW) process that shows far more sensitivity than standard physical gate-length (CD) measurements. The performance (electrical "effective" gate length, Leff) variations measured by PBM can NOT be explained by CD (physical gate) measurements alone, showing that the non-destructive (non-contact) PBM is able to monitor and control, at the first level of electrical connectivity (≥ M1), the bin-yield-determining in-die variations that are NOT captured or realized by physical CD measurement. Along with this higher sensitivity, we also show that the process-induced variation (excursion) has a distinct signature versus the "nominal" expected behavior.
Implementation of multiple ROI with single FOV for advanced mask metrology
Show abstract
As technology nodes go down to 45nm and below, mask metrology becomes more important as the critical features decrease in size while, at the same time, the number of measurements that need to be performed increases. OPC and RET put a further burden on metrology, as it is typical to measure more than one dimension on a single feature. In order to maximize the throughput of metrology tools and to keep up with the demand for more measurements, we have implemented the ability to measure multiple CD sites within a single field of view, without any stage movement, in a fully automated way in a production environment. This in turn reduces the total mask measurement time and helps to increase tool capacity.