Proceedings Volume 5378

Data Analysis and Modeling for Process Control

View the digital version of this volume at the SPIE Digital Library.

Volume Details

Date Published: 29 April 2004
Contents: 9 Sessions, 26 Papers, 0 Presentations
Conference: Microlithography 2004
Volume Number: 5378

Table of Contents

  • Keynote Paper
  • Advanced Process Control I
  • Data Modeling for Control I
  • CD Uniformity Control: Joint Session with Conf. 5875
  • Advanced Process Control II: Joint Session with Conf. 5375
  • Methods for Data Analysis and Automation
  • Advanced Process Control II
  • Data Modeling for Control II
  • Data Modeling for Control I
  • Poster Session
Keynote Paper
Intel nanotechnology integrated process control systems: an overview
As patterning dimensions decrease, die yield and performance become increasingly sensitive to ever smaller process variations. To minimize variability, process control is applied to prevent excursions, improve yield, decrease non-product runs, reduce cycle time lost to rework, and reduce equipment calibration and maintenance. Intel inline process control aims at rapid detection, classification, prediction, and correction of problems and non-optimal performance during wafer processing. Efficient process control requires robust analysis to monitor the process and to detect and predict its behavior. The paper addresses Intel model-based control and focuses on the various model-based analysis and control modules that Intel has developed and deployed across technology generations. With the rapid increase in the number of analysis and control modules and the emerging need to integrate them so that data, applications, and methods can be shared, standard interfaces for such modules must be defined. This need motivated Intel to lead the development of SEMI E133, the Process Control Systems (PCS) standard approved in October 2003.
Advanced Process Control I
In-tool process control for advanced patterning based on integrated metrology
David S. L. Mui, Hiroki Sasano, Wei Liu, et al.
Control on the order of a nanometer is crucial for present-day advanced logic, SRAM, and DRAM integrated circuits (ICs). This level of control is necessary to ensure proper functioning of these circuits. In logic and SRAM applications the most important control parameter is the critical dimension of the gate conductor, while for DRAM deep trench it is the etch depth. Advanced Process Control (APC) using feedforward and feedback closed-loop techniques has been implemented in fabs for over two decades. Until recently, most fabs have relied exclusively on standalone metrology tools to collect critical wafer parameters. In this paper, a fully integrated Transforma™ Closed Loop (CL) etch system is used to achieve nanometer-scale gate etch control by enabling, for the first time, real-time feedforward and feedback measurement on a gate etch process.
Multivariable versus univariable APC
Kamyar Faron, Mark Freeland, Ole Krogh, et al.
The need for process control in the semiconductor industry has been established and the benefits have been demonstrated. In the past, process control applications in semiconductor manufacturing have relied on univariable control schemes. In the case of poly gate etch, control methods have manipulated trim time or O2 flow to maintain a target poly gate FI CD as the input DI CD varies. This study focuses on the impact of process control on secondary output goals and compares the effect of univariable control with that of multivariable control. There are two important conclusions for semiconductor manufacturers: (i) univariable control schemes sacrifice secondary control goals when trying to achieve the primary control goal, and (ii) manufacturers must adopt multi-input multi-output control schemes to actively control both secondary process goals and primary control goals.
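As an illustration of the trade-off the authors describe, the sketch below contrasts a single-knob correction with a two-input, two-output correction using a hypothetical linear gain matrix; the sensitivities, inputs (trim time, O2 flow), and outputs are illustrative stand-ins, not the study's actual process model.

```python
# Contrast of univariable vs. multivariable run-to-run correction using a
# hypothetical linear gain matrix G (outputs: FI CD, sidewall angle;
# inputs: trim time, O2 flow).  All numbers are illustrative.
import numpy as np

G = np.array([[-1.5, -0.8],    # nm per s of trim, nm per sccm of O2 (assumed)
              [ 0.1, -0.4]])   # deg per s of trim, deg per sccm of O2 (assumed)

error = np.array([3.0, 1.0])   # current deviation from (CD, profile) targets

# Univariable scheme: adjust trim time only, to zero the primary CD error.
du_siso = np.array([-error[0] / G[0, 0], 0.0])
residual_siso = error + G @ du_siso        # secondary output is left to drift

# Multivariable scheme: solve for both inputs to zero both output errors.
du_mimo = np.linalg.solve(G, -error)
residual_mimo = error + G @ du_mimo

print("SISO move:", du_siso, "-> residual:", residual_siso)
print("MIMO move:", du_mimo, "-> residual:", residual_mimo)
```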
Advanced process control applied to metal layer overlay process
Overlay control of layers printed onto processed metal levels depends strongly on how the process affects the alignment and overlay measurement targets. This study characterizes the systematic influence of AlCu deposition and W CMP on both alignment and overlay metrology. The systematic influence of AlCu deposition on alignment and overlay targets is explored in theory and then verified experimentally. Both theory and experimental results are then validated empirically as “non-zero” overlay control is applied in high-volume production to increase wafer-edge yield. The influence of W CMP on lot-to-lot overlay performance is also characterized and accounted for to further improve metal-one overlay performance. The emphasis of this paper is model-based Advanced Process Control (APC). Methods used to characterize process influence on overlay for a back-end metal process are discussed. We then describe how overlay was modeled predictively in terms of AlCu deposition target life and W CMP endpoint, as well as normal control context such as exposure tool and part ID. The challenge of implementing APC with increased context partitioning is discussed and the need for a model-based approach is stressed. The methodology used for lot disposition in the case of “non-zero” targeted overlay is also explained.
Data Modeling for Control I
Modeling for profile-based process-window metrology
Christopher P. Ausschnitt, Shaunee Y. Cheng
We formulate a physical model to extract effective dose and defocus (EDD) from pattern profile data and demonstrate its efficacy in the analysis of in-line scatterometer measurements. From the measurement of a single target structure, the model enables simultaneous computation of pattern dimensions pre-calibrated to the imaging system dose and focus settings. Our approach is generally applicable to ensuring the adherence of pattern features to dimensional tolerances in the control and disposition of product wafers while minimizing in-line metrology.
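A rough sense of how a calibrated profile model can be inverted to recover effective dose and defocus from a single target measurement is sketched below; the quadratic response form and all coefficients are invented for illustration and are not the authors' EDD formulation.

```python
# Grid inversion of an assumed calibrated profile model to recover effective
# dose and defocus from a single scatterometry measurement.  The response
# coefficients and the measured values are invented for illustration.
import numpy as np

def profile_model(E, F):
    """Assumed calibration: bottom CD (nm) and sidewall angle (deg) vs. dose E
    (mJ/cm^2) and defocus F (um).  The linear F term in the sidewall angle
    breaks the sign ambiguity of defocus."""
    cd  = 90.0 - 2.0 * (E - 30.0) + 40.0 * F**2
    swa = 88.0 - 0.1 * (E - 30.0) - 15.0 * F**2 + 10.0 * F
    return cd, swa

measured_cd, measured_swa = 87.4, 88.7     # example single-target measurement

E_grid = np.linspace(25.0, 35.0, 201)
F_grid = np.linspace(-0.3, 0.3, 121)
EE, FF = np.meshgrid(E_grid, F_grid)
cd, swa = profile_model(EE, FF)
resid = (cd - measured_cd) ** 2 + (swa - measured_swa) ** 2
iy, ix = np.unravel_index(np.argmin(resid), resid.shape)
print(f"effective dose ~ {EE[iy, ix]:.2f} mJ/cm^2, defocus ~ {FF[iy, ix]:+.3f} um")
```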
Model-based fault detection and metrology error rejection in registration APC system
Ziqiang John Mao, Issi Geier
After we implemented the run-to-run feedback control system for lithographic stepper registration, we found that metrology error introduced erroneous control signals that drove the process away from normal operation. This paper presents methods for model-based fault detection and metrology error rejection. We use the fault detection method to monitor the health of the run-to-run system and apply the error rejection method to proactively correct the control signal and ensure the desired targets. Compared to the old run-to-run system used in the litho process, which only provides warning limits and hard limits with fixed thresholds on individual physical variables, the proposed fault detection method is more sensitive to drift, shift, and out-of-control points; it would have detected a real problem much more quickly had it been in use. The error rejection method handles metrology errors as well as shift and drift by using the estimated output instead of the measured output. Experiments on real and simulated data validate the usefulness of the method.
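The idea of rejecting suspect measurements and controlling on an estimated output can be sketched with a simple EWMA estimator and an innovation test, as below; the gain, noise level, and threshold are assumptions, not the values used in the paper.

```python
# EWMA state estimate with an innovation test: measurements whose innovation
# exceeds k_sigma * sigma are treated as metrology faults and rejected, and
# the controller keeps using the model estimate.  Values are illustrative.
lam, sigma, k_sigma = 0.3, 1.0, 4.0
estimate = 0.0

for measurement in [0.1, -0.2, 0.3, 8.5, 0.4, 0.2]:   # 8.5 is a metrology flyer
    innovation = measurement - estimate
    if abs(innovation) > k_sigma * sigma:
        print(f"measurement {measurement:+.1f} rejected; estimate stays {estimate:+.3f}")
        continue
    estimate += lam * innovation                        # accept and update
    print(f"measurement {measurement:+.1f} accepted; estimate now  {estimate:+.3f}")
```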
Advanced module-based approach to effective CD prediction of sub-100nm patterns
In this article, an advanced module-based approach is introduced to simulate sub-100 nm patterns. Topography (TOPO), an in-house lithography simulator, consists of four basic modules: i) illumination, ii) mask, iii) imaging, and iv) resist. Because TOPO is module-based, it is convenient for user-specific applications. The input parameter of the illumination module is the pupil intensity profile, which is measured using the ASML transmission image sensor. In the mask kernel, the mask corner-rounding effect is considered, while the imaging module accounts for lens aberration and flare. Finally, the resist module uses a Gaussian convolution model, trading the accuracy of a full resist model for the speed of Gaussian convolution. As an application example, iso-dense bias (ID bias) fitting is implemented for an ArF resist to image sub-100 nm patterns. Simulation results show that the fitting error meets the prediction-accuracy target of the International Technology Roadmap for Semiconductors 2002. The advanced module-based model, using an aerial image with the measured pupil intensity profile and Gaussian convolution, appears to be an effective way to predict the CD of sub-100 nm patterns.
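The speed-oriented resist approximation mentioned above, a Gaussian convolution of the aerial image followed by a threshold, can be illustrated as follows; the toy aerial image, diffusion length, and threshold are assumptions rather than TOPO's calibrated values.

```python
# Gaussian-convolution resist approximation: blur a toy 1-D aerial image with
# a Gaussian diffusion kernel and apply a constant threshold to find the
# printed line width.  Image, diffusion length, and threshold are assumed.
import numpy as np

x = np.linspace(-400.0, 400.0, 1601)                  # nm, 0.5 nm grid
pitch = 200.0
aerial = 0.5 - 0.45 * np.cos(2 * np.pi * x / pitch)   # dark line centered at x = 0

sigma = 30.0                                           # nm, diffusion length (assumed)
kx = np.arange(-150.0, 150.5, 0.5)
kernel = np.exp(-kx**2 / (2 * sigma**2))
kernel /= kernel.sum()

latent = np.convolve(aerial, kernel, mode="same")      # blurred "latent image"
threshold = 0.5                                        # resist remains below threshold
line = x[(np.abs(x) < pitch / 2) & (latent < threshold)]
print(f"printed line width ~ {line.max() - line.min():.1f} nm")
```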
Propagation of APC models across product boundaries
Tito Chowdhury, Mark Freeland, Ole Krogh, et al.
The BCT solution is an automatic correction for systematic offset built into the PCS product, based on calibration from 2-5 wafers. This paper explores the validity of a predictive model for process control for use by semiconductor device manufacturers with a multitude of products or part numbers. The proposed model defines the parameters of interest as a function of the film stack, tool attributes, and mask characteristics. The paper proposes a process for model development that dramatically reduces the cost of materials, tool time, and engineering effort.
CD Uniformity Control: Joint Session with Conf. 5875
Comparing the transient response of a resistive-type sensor with a thin film thermocouple during the post-exposure bake process
Kenneth G. Kreider, David P. DeWitt, Joel B. Fowler, et al.
Recent studies on dynamic temperature profiling and lithographic performance modeling of the post-exposure bake (PEB) process have demonstrated that the rate of heating and cooling may have an important influence on resist lithographic response. Accurate measurement of the transient surface temperature during heating or cooling can only be assured if the sensors embedded in or attached to the test wafer do not affect the temperature distribution in the bare wafer. In this paper we report on an experimental and analytical study comparing the transient response of embedded platinum resistance thermometer (PRT) sensors with surface-deposited thin-film thermocouples (TFTCs). The TFTCs on silicon wafers were developed at NIST to measure wafer temperatures in other semiconductor thermal processes. Experiments are performed on a test bed built from a commercial, fab-qualified module with hot and chill plates, using wafers instrumented with calibrated type-E (NiCr/CuNi) TFTCs and commercial PRTs. Time constants were determined from an energy-balance analysis, fitting the temperature-time derivative to the wafer temperature during heating and cooling. The time constants for instrumented wafers ranged from 4.6 s to 5.1 s on heating for both the TFTC and PRT sensors, with an average difference of less than 0.1 s between the TFTCs and PRTs and slightly greater differences on cooling.
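The energy-balance extraction of a time constant can be illustrated with a first-order heating model, dT/dt = (T_plate − T)/τ, fitted on a synthetic trace; the plate temperature, noise level, and assumed τ below are illustrative, chosen only to fall near the reported 4.6-5.1 s range.

```python
# First-order energy balance: dT/dt = (T_plate - T) / tau, so regressing the
# temperature-time derivative against wafer temperature gives tau = -1/slope.
# The trace below is synthetic, with an assumed tau near the reported range.
import numpy as np

tau_true, T_plate, T0 = 4.8, 130.0, 22.0               # s, degC, degC (assumed)
t = np.arange(0.0, 30.0, 0.1)
T = T_plate + (T0 - T_plate) * np.exp(-t / tau_true)   # simulated wafer heat-up
T += np.random.default_rng(0).normal(0.0, 0.05, t.size)  # sensor noise

dTdt = np.gradient(T, t)
slope, _ = np.polyfit(T, dTdt, 1)
print(f"extracted time constant ~ {-1.0 / slope:.2f} s")
```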
Intra-wafer CDU characterization to determine process and focus contributions based on scatterometry metrology
Mircea Dusa, Richard Moerman, Bhanwar Singh, et al.
Current advanced lithography processes are based on a Critical Dimension (CD) budget of 10nm or less, with errors caused by the exposure tool, wafer substrate, wafer process, and reticle. As such, allowable CD variation across the wafer becomes an important parameter to understand, control, and minimize. Three sources of error contribute to the CD Uniformity (CDU) budget: run-to-run (R2R), wafer-to-wafer (W2W), and intra-wafer. While the R2R and W2W components are characterized and compensation control techniques have been developed to minimize their contribution, the intra-wafer component is more or less ignored, with the consequence that its sources of error have not been characterized and no compensation technique is available. In this paper, we propose an approach to analyze intra-wafer CD sources of variation, identifying non-random CDU behavior and connecting it with disturbances caused by processing errors described by their wafer spatial coordinates. We define a process error as a disturbance and its effect as a feature response. We study the impact of modeling the spatial distribution of a feature response as calculated by diffractive optical CD metrology (scatterometry) and relate it to a programmed process disturbance. Process disturbances are classified in terms of the time characteristics that define their spatial distribution. We demonstrate the feature response to a disturbance both as statistical values and as spatial profiles. We found that the CD response alone is not sufficient to determine the sources of a process disturbance, and accordingly added responses from other features, which aid detection of CDU error sources. The added responses come from the scatterometry principle, based on a model definition of a litho pattern described by its shape with characteristic features: bottom CD, resist thickness, sidewall angle, and bottom antireflective layer thickness. Our results show that process errors with continuous intra-wafer variation, such as PEB and BARC thickness, have larger effects on CDU than process errors with discrete intra-wafer behavior, such as dose and defocus. The correlation between multiple feature responses to a process disturbance was characterized as the spatial covariance between CD and resist thickness and between CD and SWA. Spatial feature covariance enhances the capability to infer sources of process disturbance from metrology data.
Advanced Process Control II: Joint Session with Conf. 5375
In-line lithography cluster monitoring and control using integrated scatterometry
Ivan Pollentier, Shaunee Y. Cheng, Bart Baudemprez, et al.
In the continuous drive for smaller feature sizes, process monitoring becomes increasingly important to compensate for the smaller lithography process window and to assure that Critical Dimensions (CD) remain within the required specifications. Moreover, the higher level of automation in manufacturing enables almost real-time correction of lithography cluster machine parameters, resulting in more efficient and controlled use of the tools. Fast and precise in-line lithography metrology driving Advanced Process Control (APC) rules is therefore becoming crucial to guarantee that critical dimensions stay correctly targeted. In this paper, the feasibility of improving the CD control of a 193nm lithography cluster has been investigated using integrated scatterometry. The goal of the work was to determine whether a dose correction at field and wafer level, based on precise in-line measurements, could improve overall CD control. First, the integrated metrology was evaluated extensively for precision and sensitivity to prove its suitability for this kind of control. With a long-term repeatability significantly better than 0.75nm 3σ, it is very promising with respect to the requirements for sub-nanometer CD correction. Moreover, an extensive evaluation of the process window on the lithography cluster showed that focus variation is minimal and that CD control can be improved using dose correction only. In addition, systematic variations in across-wafer and across-lot uniformity were determined during the monitoring period in order to identify correctable fingerprints. Finally, the dose correction model was applied to compensate for these systematic CD variations, and improved CD control was demonstrated. Using a simple dose correction rule, a forty percent improvement in CD control was obtained.
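A minimal version of a per-field dose-correction rule of the kind described might look like the sketch below; the CD-versus-dose sensitivity, step limit, and measurement values are assumptions, not the cluster's calibrated numbers.

```python
# Per-field dose-correction rule: convert each field's CD error into a dose
# offset through an assumed CD-vs-dose sensitivity, clipped for stability.
import numpy as np

cd_target = 100.0                    # nm
dCD_dDose = -2.5                     # nm per mJ/cm^2 (assumed sensitivity)
max_step = 0.5                       # mJ/cm^2, per-run correction limit

measured_cd = np.array([101.2, 100.4, 99.1, 100.9])    # per-field scatterometry CDs
dose_offset = np.clip((cd_target - measured_cd) / dCD_dDose, -max_step, max_step)
print("per-field dose corrections (mJ/cm^2):", np.round(dose_offset, 2))
```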
Complementary feed-forward and feedback method for improved critical dimension control
Igor Jekauc, Christopher J. Gould, Walter Hartner, et al.
In modern semiconductor manufacturing, control of patterned line widths is an especially important task, as even the smallest deviation from the desired critical dimension (CD) target can result in undesirable electrical behavior. Traditional methods of CD control have included a feedback component in which dose is adjusted over time based on the measured critical dimensions of previous lots. Depending on the process setup, the influence of the film stack on patterned features can often be diminished by applying an organic anti-reflection coating (ARC) prior to the photoresist. Unfortunately, even with an ARC layer, CD influences may be pronounced due to extreme topography and film stack. Most often any such CD influence is exhibited as a lot-to-lot (L2L) component of variation and, to a lesser degree, as a wafer-to-wafer (W2W) component. A simple feedback system can be adjusted to encompass a larger number of lots for dose recommendations, thereby including closer to the entire population of stack variation. An improved control system is one in which this feedback component is supplemented by a feed-forward component, where a stack predictor is used to provide a lot-specific recommendation. Stack information for a lithographic layer used in DRAM manufacturing is presented alongside the relationship between the critical dimensions of features patterned in photoresist and the top film thickness. The significant rework-cost reduction and improvement in CD control achieved over a two-month implementation of the complementary feed-forward and feedback system are compared against performance with the feedback-only system.
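The complementary structure, an EWMA-style feedback term plus a film-thickness feed-forward term, can be sketched as follows; the linear thickness-to-CD relation and every coefficient are illustrative assumptions, not the DRAM layer's measured relationship.

```python
# Sketch of a complementary feed-forward/feedback dose recommendation.
# All coefficients (sensitivities, EWMA weight, nominal values) are assumed.
cd_target = 140.0        # nm
dCD_dDose = -3.0         # nm of CD per mJ/cm^2 (assumed)
dCD_dThk = 0.05          # nm of CD per nm of top-film thickness (assumed)
thk_ref = 800.0          # nm, nominal top-film thickness
lam = 0.3                # EWMA feedback weight

dose_fb = 32.0           # running feedback dose baseline (mJ/cm^2)

# Each tuple pairs the previous lot's measured resist CD (feedback input)
# with the incoming lot's measured top-film thickness (feed-forward input).
for prev_cd, incoming_thk in [(141.5, 812.0), (139.2, 795.0), (140.8, 805.0)]:
    dose_fb += lam * (cd_target - prev_cd) / dCD_dDose          # feedback update
    dose_ff = -dCD_dThk * (incoming_thk - thk_ref) / dCD_dDose  # feed-forward term
    print(f"recommended dose for incoming lot: {dose_fb + dose_ff:.2f} mJ/cm^2")
```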
Methods for Data Analysis and Automation
Automatic defect classification using topography map from SEM photometric stereo
As the industry moves to smaller design rules, shrinking process windows, and shorter product lifecycles, the need for enhanced yield management methodology is increasing. Defect classification is required for identification and isolation of yield loss sources. Practice shows that operators rely heavily on 3D information when classifying defects. Therefore, Defect Topographic Map (DTM) information can dramatically enhance Automatic Defect Classification (ADC) capabilities. In this article, we describe how reliable and rapid SEM measurements of defect topography characteristics increase the classifier's ability to quickly identify the exact process step at which a given defect was introduced. Special multiple-perspective SEM imaging allows efficient application of photometric stereo methods. Physical properties of a defect can be derived from the 3D topography map using straightforward computer vision algorithms. We show several examples, from both production fabs and R&D lines, in which the depth map is essential to correctly partitioning the defects, thereby reducing time-to-source and overall fab expenses due to defect excursions.
Automated fault detection and classification of etch systems using modular neural networks
Sang Jeen Hong, Gary Stephen May, John Yamartino, et al.
Modular neural networks (MNNs) are investigated as a tool for modeling process behavior and for fault detection and classification (FDC) using tool data in plasma etching. Principal component analysis (PCA) is initially employed to reduce the dimensionality of the voluminous multivariate tool data and to establish relationships between the acquired data and the process state. MNNs are subsequently used to identify anomalous process behavior. A gradient-based fuzzy C-means clustering algorithm is implemented to enhance MNN performance. MNNs for eleven individual steps of etch runs are trained with data acquired from baseline, control (acceptable), and perturbed (unacceptable) runs, and then tested with data not used for training. In the fault identification phase, a 0% false alarm rate is achieved for the control runs.
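The PCA preprocessing step can be sketched with a plain SVD-based projection of the multivariate tool data onto a few components, as below; the random matrix stands in for real etch sensor data and the component count is arbitrary.

```python
# SVD-based PCA projection of multivariate tool data onto a few principal
# components; the random matrix is a placeholder for real etch sensor traces.
import numpy as np

rng = np.random.default_rng(1)
runs, sensors, n_components = 60, 40, 3
X = rng.normal(size=(runs, sensors))            # placeholder tool data

Xc = X - X.mean(axis=0)                          # mean-center each channel
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:n_components].T                # low-dimensional run signatures
explained = (S[:n_components] ** 2) / (S ** 2).sum()

print("explained variance ratios:", np.round(explained, 3))
print("score matrix shape:", scores.shape)       # these scores would feed the MNNs
```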
Integrated electrical and SEM-based defect characterization for rapid yield ramp
Jacob Orbon, Lior Levin, Ofer Bokobza, et al.
The challenges of new nanometer processes have complicated yield enhancement. The systematic yield loss component is increasing due to the complexity and density of the new processes and the designs developed for them. High product yields can now only be achieved when process failure rates are on the order of a few parts per billion structures. Traditional yield ramping techniques cannot ramp yields to these levels, and new methods are required. This paper presents a new systematic approach to yield-loss pareto generation. The approach uses a sophisticated Design-of-Experiments (DOE) methodology to characterize systematic and random yield-loss mechanisms in the Back End Of the Line (BEOL). Characterization Vehicle™ (CV) test chips, fast electrical test, and Automatic Defect Localization (ADL) are critical components of the method. Advanced statistical analysis and visualization of the detected and localized electrical defects provide a comprehensive view of the yield-loss mechanisms. Where defects are not visible in an SEM image of the structure surface, automated FIB and imaging are used to characterize the defect. The combined approach provides the resolution required to characterize parts-per-billion failure rates.
PVD fault detection using disparate integrated data sources
In this paper, sensor data pertaining to plasma sputtering is examined. A sensor system monitoring power supply signals and detecting arcs was integrated with the manufacturing equipment, and data were collected. In-situ measurements from the sensors correlate with post-process yield metrology, providing a mechanism for improved process control. Furthermore, various classes of faults can be diagnosed from the rich set of data streaming from the sensors. An error-calculation and variable-transformation methodology is presented so that classes of faults may be discriminated from one another.
CD error budget analysis in ArF lithography
For CD (Critical Dimension) control, we classified the factors contributing to CD variation in each process step and quantified those arising in the tools used in 193nm lithography: the exposure tool, the coater/developer, and the CD-SEM. In the coater/developer, the influence of PEB (Post Exposure Bake) on CD variation was notable, making up about 70% of the track-related factors; this indicates the great importance of PEB in the 193nm process. For the exposure tool, we quantified the CD variation caused by flare using the Kirk method. We determined that this contribution is influenced by the exposure field layout, and the intra-wafer variation was 1.58nm. For the CD-SEM, we measured the CD variation caused by electron-beam-induced CD shrink and by LWR (Line Width Roughness). LWR accounts for about 40% of the total measurement error and affects CD variation more strongly for finer line patterns. We reduced the influence of LWR on CD variation by extending the measurement points and averaging, and thus obtained a CD uniformity close to the actual CD.
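The benefit of extending measurement points and averaging can be seen from the roughly 1/√N suppression of uncorrelated roughness noise; the short illustration below assumes a hypothetical LWR value.

```python
# 1/sqrt(N) suppression of the LWR contribution to a CD measurement when N
# uncorrelated points along the line are averaged; the LWR value is assumed.
import numpy as np

lwr_sigma = 3.0                                  # nm, local roughness (assumed)
for n_points in (1, 4, 16, 64):
    print(f"{n_points:3d} points -> LWR-induced CD uncertainty ~ "
          f"{lwr_sigma / np.sqrt(n_points):.2f} nm")
```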
Advanced Process Control II
Improvement of 90nm KrF Cu process window by minimizing via deformation caused by low-frequency resonance of scanner projection lens
There has been much work on extending KrF lithography to the 90 nm logic generation, especially for back end of line (BEOL) layers. The high cost and immaturity of ArF tools and photoresists are the major factors that make IC manufacturers explore the possibilities of KrF lithography. For mass production of the 130 nm node, KrF lithography was pushed hard to achieve 160 nm contact holes at a 320 nm pitch. Pushing KrF lithography further, printing of 140 nm via holes at a minimum pitch of 280 nm was required by the tight 90 nm design rules. Optimizing illumination settings is one way to obtain reasonable process windows through all pitches for mass production of 90 nm node logic devices; maintaining exposure tools in good condition is another. Control of pattern deformation becomes more and more significant as the critical dimension is driven to the limit. In this paper, oval-shaped via holes were found for symmetrical pitch patterns. Lens aberration and scanner synchronization errors are always the first considerations when pattern deformation occurs, but our investigations demonstrated improved control of via pattern deformation by reducing the low-frequency resonance of the scanner projection lens. The via deformation was investigated for combinations of different scanning and stepping speeds of the scanner stages, which cause different amplitudes of projection-lens resonance. The low-frequency region of the projection-lens resonance spectra showed lower amplitude when the scanning or stepping speed was slow, and pattern distortion was likewise reduced as the amplitude of the low-frequency projection-lens resonance decreased. The common process window was then improved by eliminating the via CD difference between the x and y directions. With this improvement, a reasonable process window (DOF ~ 0.3 um) can be achieved for mass production of 90 nm devices on KrF lithography tools.
Use case approach to integrating and implementing lithography run-to-run control
Dorit Karlikar, Irit K. Abramovich, Miri Kish, et al.
The benefits of using a run-to-run control system for overlay and CD control have been well documented. However, before any of these benefits can be achieved, the run-to-run control system must first be integrated into the existing automation and manufacturing execution system (MES) environment. Integration details overlooked during the planning stages often create unnecessary challenges down the road that can delay reaching advantageous control results. INFICON has developed a novel methodology for documenting process and integration requirements. This method, termed Use Case Review, brings together the appropriate resources from the supplier and the customer to review and customize a predetermined set of documents that describe the run-to-run controller. Each use case contains a flow diagram and a detailed sequence of transactions documenting the actors (Automation PC, Process Equipment, MES, etc.) and variables (Lot ID, Process Level ID, Recipe ID, etc.) involved. The combined set of use cases covers all aspects of integrating a lithography run-to-run controller. During the implementation of NVS ARGUS, TOWER Semiconductor Ltd. benefited from use case review and customization.
Enhancement of photolithographic performance by implementing an advanced process control system
Traditional semiconductor manufacturing relies on a fixed process recipe combined with classic statistical process control to monitor the production process. Leading-edge manufacturing processes continue to require increasingly stringent critical dimension and overlay control, which in turn demands innovative methods for process control. Meeting tighter process specifications while maintaining productivity dictates implementation of Advanced Process Control (APC) methods. An active control method exercised in APC enables the user to modify recipe variables in order to compensate for various disturbances such as drift or step changes in tool operation, or in the conditions of incoming product. The automated version of this control methodology is termed Run-to-Run (R2R) control. R2R control systems compensate for many of the dynamic issues that stand in the way of high tool dependability, leading to benefits such as compensation for process variation, improved overlay control, rework reduction, reduced use of send-ahead wafers, and increased exposure tool availability. For R2R systems, the integrity of the data from metrology tools is critical. In an automated fab environment, data is fed directly from measurement tools into databases, where it is used to generate feedback corrections on subsequent production material. Metrology measurements are often based on pattern recognition at the measurement site; problems with pattern recognition can therefore lead to flyer data, which in turn may degrade the quality of data used in the feedback loop. Using operators to inspect and approve each measurement is costly. In a foundry environment, where multiple products are manufactured, an additional challenge is introduced: historical data used to generate feedback can be out of date for a given combination of product and tool status. Routine Preventive Maintenance (PM) procedures may require updating machine-constant values related to overlay performance; in these cases, the R2R controller should be reset and a new send-ahead wafer used. At Tower, an R2R control system providing overlay process corrections was integrated into the production environment. Overlay performance metrics were monitored before and after system introduction to show the benefit of R2R control. Additional work was done to characterize the performance benefit of introducing advanced data filters and tool PM data into the same R2R control system. Results from this additional work show how effectively identifying and removing outliers can improve data integrity, and how tool PM data can be used to respond appropriately to step functions following exposure tool PM adjustments.
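The two additions evaluated, outlier (flyer) filtering and a controller reset on tool PM, can be sketched with a simple EWMA overlay-correction loop; the threshold, gain, and lot data below are assumptions, not Tower's implementation.

```python
# EWMA overlay correction loop with a flyer filter and a reset on exposure-tool
# PM; the threshold, gain, and lot data are assumptions for illustration.
lam, flyer_limit = 0.3, 30.0          # EWMA weight, nm rejection threshold
correction = 0.0                       # running overlay correction (nm)

lots = [
    {"overlay": 12.0, "pm_reset": False},
    {"overlay": 95.0, "pm_reset": False},   # pattern-recognition flyer
    {"overlay":  9.0, "pm_reset": False},
    {"overlay": 14.0, "pm_reset": True},    # PM changed overlay machine constants
]

for lot in lots:
    if lot["pm_reset"]:
        correction = 0.0               # restart history; a send-ahead would re-seed it
    residual = lot["overlay"] - correction
    if abs(residual) > flyer_limit:
        print("flyer rejected; correction unchanged at", round(correction, 2), "nm")
        continue
    correction += lam * residual
    print("updated overlay correction:", round(correction, 2), "nm")
```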
Data Modeling for Control II
Yield loss in lithographic patterning at the 65nm node and beyond
Parametric yield loss is an increasing fraction of total yield loss. Much of this originates in lithography in the form of pattern-limited yield. In particular, the ITRS has identified CD control at the 65nm technology node as a potential roadblock with no known solutions. At 65nm, shrinking design rules and narrowing process windows will become serious yield limiters. In high-volume production, corrections based on lot averages will have diminished correlation to device yield because APC systems will dramatically reduce error at the lot and wafer levels. As a result, cross-wafer and cross-field errors will dominate the systematic variation on 300mm wafers. Much of the yield loss will arise from hidden systematic variation, including intra-wafer dose and focus errors that occur during lithographic exposure. In addition, corollary systematic variation in the profiles of critical high-aspect-ratio structures will drive requirements for vertical process control. In this work, we model some of the potential yield losses and show how sensitive focus-exposure monitors and spectroscopic ellipsometry can be used to reduce the impact of hidden error on pattern limited yield, adding tens of millions of dollars in additional revenue per factory per year.
Improving manufacturing variability control in advanced CMOS technology by using TCAD methodology
Jihong Chen, Jeff Wu, Kaiping Liu, et al.
Rapid development of a well-controlled manufacturing process is a key component of marketplace success. Accomplishing this requires a thorough understanding of the effects of process variations on parametric yield. Technology Computer-Aided Design (TCAD) simulations and statistical analysis can decrease the time needed to assess the manufacturability of various transistor design options and identify the key process parameters that cause the largest variations. This paper covers a new methodology that combines Design of Experiments (DOE) with process and device simulations to generate transistor parametric statistical models. Monte Carlo simulations are performed to generate transistor parametric sensitivities and statistical distributions. Examples of applying this methodology to 130nm technology are given.
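A stripped-down Monte Carlo propagation of process variation through a fitted response surface is sketched below; the linear Vt model and the variation magnitudes are made-up placeholders for the TCAD-derived sensitivities described in the paper.

```python
# Monte-Carlo propagation of assumed process variation through a hypothetical
# linear Vt response surface (standing in for a TCAD-fitted model), followed by
# a simple ranking of each parameter's contribution to the Vt spread.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

lg   = rng.normal(130.0, 3.0 / 3, n)     # gate length (nm), assumed 3-sigma = 3 nm
tox  = rng.normal(2.0, 0.05 / 3, n)      # oxide thickness (nm), 3-sigma = 0.05 nm
dose = rng.normal(0.0, 2.0 / 3, n)       # implant dose deviation (%), 3-sigma = 2%

coefs = {"Lgate": -0.002, "Tox": 0.05, "dose": 0.003}     # V per unit (assumed)
vt = (0.35 + coefs["Lgate"] * (lg - 130.0)
           + coefs["Tox"] * (tox - 2.0)
           + coefs["dose"] * dose)

print(f"Vt mean = {vt.mean():.3f} V, sigma = {vt.std():.4f} V")
for name, x in [("Lgate", lg), ("Tox", tox), ("dose", dose)]:
    print(f"{name:6s} contribution sigma ~ {abs(coefs[name]) * x.std():.4f} V")
```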
Mix-and-match overlay method by compensating dynamic scan distortion error
Takuya Kono, Manabu Takakuwa, Keita Asanuma, et al.
This paper discusses a compensation method and an APC system for reducing mix-and-match overlay errors between scanners. We propose a compensation model for intra-field errors in mix-and-match exposure, and we have developed an advanced APC system that uses this model to correct dynamic scan distortion.
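One common form of intra-field compensation is a linear fit of translation, magnification, and rotation terms to measured overlay errors; the sketch below uses that generic 4-parameter model with invented measurements and is not the authors' specific compensation model.

```python
# Least-squares fit of a generic linear intra-field model (translation,
# symmetric magnification, rotation) to measured overlay errors.
# Measurement values and the model form are illustrative only.
import numpy as np

# Intra-field target positions (mm from field center) and measured dx, dy (nm).
x = np.array([-10.0, 10.0, 10.0, -10.0, 0.0])
y = np.array([-10.0, -10.0, 10.0, 10.0, 0.0])
dx = np.array([12.0, 18.0, 15.0,  9.0, 13.0])
dy = np.array([-4.0, -1.0,  3.0,  0.0, -1.0])

# Model: dx = Tx + M*x - R*y ;  dy = Ty + M*y + R*x
A = np.zeros((2 * x.size, 4))
A[0::2, 0] = 1.0;  A[0::2, 2] = x;  A[0::2, 3] = -y
A[1::2, 1] = 1.0;  A[1::2, 2] = y;  A[1::2, 3] = x
b = np.empty(2 * x.size)
b[0::2], b[1::2] = dx, dy

(tx, ty, mag, rot), *_ = np.linalg.lstsq(A, b, rcond=None)
residual = b - A @ np.array([tx, ty, mag, rot])
print(f"Tx={tx:.1f} nm, Ty={ty:.1f} nm, mag={mag:.2f} nm/mm, rot={rot:.2f} nm/mm")
print(f"residual 3-sigma ~ {3 * residual.std():.1f} nm")
```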
Necessary nonzero lithography overlay correctables for improved device performance for 110nm generation and lower geometries
Igor Jekauc, Bill Roberts, Paul Young, et al.
Minimizing alignment errors has, in the past, been fairly straightforward. The aim has always been to drive the overlay model correctables to zero, either immediately or over a number of lots processed in a short time frame, depending on the controller setup. Methods for improving alignment have included minimizing components of variation tied to the exposure tool, metrology tool, process setup, or the model itself. Instead of working on these components, a less expensive alternative for improving the final outcome, as represented by device performance, may be not to minimize the overlay correctables but instead to drive toward a specific target defined by the process window around each correctable. This paper will briefly show that lithography at present geometries is no longer the sole controller of alignment; other areas such as films, etch, and CMP influence alignment significantly. It will also be shown that in certain instances vertical wafer topography or feature profile may create device asymmetries, which may be partially compensated through application of non-zero overlay correctables. Coping with decreased overlay performance and a methodology for controlling overlay biases are also presented.
Data Modeling for Control I
Simulation benchmarking for the whole resist process
Full lithography simulation has become an essential factor in semiconductor manufacturing. We have been investigating a wide range of lithography-process problems by creating and using our own simulation tool, which has contributed to extracting parameters related to exposure, post-exposure bake, and development; its performance has also been proven in comparison with other simulation tools. In this paper, our lithography simulator and some of its features are introduced. As a benchmark, we describe the simulator's performance and accuracy for the whole resist process in comparison with a commercial tool. The sensitivity of the process parameters and the process latitude due to those parameters are discussed.
Poster Session
Development of customer assistance software for alignment parameter optimization
Wafer alignment plays a significant role in the advancement of microlithography and has been constantly improved to meet various situations. As a result, its configuration is very dynamic, and process optimization can require considerable cost. Software has been developed that evaluates alignment performance under a variety of conditions from a minimal data set. It allows the user to perform off-line optimization, substantially reducing interruption to production. This article illustrates the simulation method implemented in the software, the OverLay EValuation program (OLEV).