Proceedings Volume 11611

Metrology, Inspection, and Process Control for Semiconductor Manufacturing XXXV

Purchase the printed version of this volume at proceedings.com or access the digital version at SPIE Digital Library.

Volume Details

Date Published: 15 April 2021
Contents: 18 Sessions, 91 Papers, 105 Presentations
Conference: SPIE Advanced Lithography 2021
Volume Number: 11611

Table of Contents

All links to SPIE Proceedings will open in the SPIE Digital Library.
  • Front Matter: Volume 11611
  • Welcome and Plenary Presentation
  • Welcome and Introduction
  • Metrology Keynote Session
  • Overlay Accuracy I
  • Challenges and New Methods
  • Inspection
  • X-Ray and High Aspect Ratio Metrology
  • Contour Metrology
  • Roughness Metrology
  • Process Control
  • Scatterometry
  • Nanosheet and Nanowire
  • Edge Placement Error
  • Overlay Accuracy II
  • Metrology and Inspection for the EUV Era
  • Late Breaking
  • Poster Session
Front Matter: Volume 11611
Front Matter: Volume 11611
This PDF file contains the front matter associated with SPIE Proceedings Volume 11611, including the Title Page, Copyright information, and Table of Contents.
Welcome and Plenary Presentation
The future of compute: how the data transformation is reshaping VLSI
Michael C. Mayberry
The digital transformation continues to gain momentum and is changing the shape of business, industry, and consumer life around the world. This transformation is characterized by continued strong demand for compute at all points in the network: at the core, the edge, and the endpoints. Data continues to grow at an exponential rate, which not only drives compute requirements but also demands efficient solutions for moving and storing data, both critical for overall performance. From device to cloud, new applications and use cases are continuously emerging. In addition to continued dimensional, materials, and device scaling, Moore's Law will evolve to meet the challenges and complexity of heterogeneous 3D integration and novel architecture integration schemes, which will continue to grow. This transformation demands that we adapt our design thinking and move from monolithic, self-contained systems to a data/information approach in which the system solution encompasses all the elements needed to convert data into information. These integrated but heterogeneous application-specific design solutions will require different approaches to device scaling to be successful.
A new era for AI HPC and IC technologies in the transition to an intelligent digital world
The massively parallel processing nature of GPU and AI deep learning accelerator (DLA) architectures has enabled the scaling up of computing power to handle massive data and large DNN models. Today's state-of-the-art AI chip on the market, GA100, has a GPU die size close to the reticle limit with 54 billion transistors; a 2.5D interposer integrating the GA100 GPU/DLA and HBM3 contains 288 billion transistors, delivering up to 2.5 petaflops of AI inferencing computing power in a single package that rivals the world's top supercomputers from 10 years ago. The IC industry was propelled by exponential growth following Moore's Law from the 1960s to 2000. This IC technology scaling is driven by key factors such as performance, power, perfection (yield/reliability), area, cost, and time to market (PPPACT). Since then, Moore's Law has slowed down significantly as the result of four main challenges: the power wall since 2000, the performance scaling wall since 2010, the area scaling wall since the 2010s, and the cost and time-to-market wall since the late 2010s. On the other hand, since 2012 there has been major progress in three key areas: big data, AI algorithms, and the advancement of GPUs and AI accelerators. These have facilitated the rapid transition into an intelligent digital world. Today's GPUs/DLAs have enabled AI for trillions of devices, with applications ranging across scientific discovery, medical and drug discovery, robotics, autonomous driving, smart cities, IoT, and industrial applications, including the IC and lithography fields. In this talk, we will also show examples of the advancements in AI computing and GPU rendering/ray tracing, as well as their applications in various fields. The rapid transition into an intelligent digital world has also demanded a tremendous increase in computing power. For example, AI model complexity has increased 30,000 times in the past five years and is currently doubling every two months. To meet the demand for exponential growth of computing power beyond Moore's Law, full-stack innovations are required, including algorithms, system and chip architecture, and IC technologies and materials. Following Huang's Law, GPU AI computing power has increased 2x every year since the beginning of the AI era, an improvement enabled by full-stack innovations. In addition, the increase in transistor count per chip from scaling, together with the lower DPPM reliability requirements of autonomous driving applications, demands that defects per chip be reduced by 2x every two years; today's advanced chips need to be below one defect per trillion. We will discuss the challenges and opportunities for advanced IC technologies (in particular lithography) and design improvements to achieve continued scaling. The progress in GPU and AI computing has also provided additional tools to solve the problems in designing and productizing future giga chips. AI has been applied extensively in various fields: IC design, OPC and mask making, IC processing and control, quality and productivity improvements, defect inspection, yield and functionality analysis, etc. CNNs and GANs are also used to ensure that the patterns processed on the wafer closely match design intent. We are in the era of an intelligent digital world empowered by AI. The advancement of GPU AI computing following Huang's Law has fueled the exponential growth of AI applications. A full-stack innovation aided by AI is also essential to help the IC and AI/HPC compute industry meet the demand for increased computing power and improved perfection.
Nanophotonics as computing machines
Merging the concept of metamaterials with the nanophotonics platform brings new functionalities to the science and technology of both fields. Since photons are usually controlled by materials, metamaterials have provided interesting platforms for engineering light-matter interaction. In recent years, a variety of features and exciting applications of meso-, micro-, and nanoscale metamaterials have been explored and developed. One of these applications, which my group is investigating, is the development of material-based, wave-based analog computing machines. We have designed material structures that can perform mathematical operations as waves propagate through them; such materials have effectively become computing machines. For example, we have explored metastructures that can solve integral and differential equations and invert matrices as incoming waves interact with them. Moreover, we have also been investigating how collections of other photonic networks (such as Mach-Zehnder interferometers), either by themselves or together with metamaterials, can provide another light-based method for solving equations and performing mathematical operations with light. In this talk, I will discuss the most recent results of some of our ongoing research programs on this topic, present physical insights into these results, and forecast future research directions in this area.
Welcome and Introduction
Welcome and Introduction to SPIE Conference 11611
Introduction to SPIE Advanced Lithography conference 11611: Metrology, Inspection, and Process Control for Semiconductor Manufacturing XXXV.
Metrology Keynote Session
The challenges of in-fab metrology: the needs for innovative solutions
ChungSam Jun
Recent semiconductor development has suffered from continuously increasing yield ramp-up times and rising costs, driven by manufacturing challenges that come with device scaling as well as 3D stacking. In addition, the difficulty of process monitoring adds further cost because of increased device failure rates. Despite impressive advancements in the metrology and inspection field, metrology issues keep growing with each device generation, while the cost of operation worsens due to higher equipment prices and lower throughput. Metrology techniques have now reached physical limitations for next-generation semiconductor products based on nanoscopic 3D structures. In particular, reliably covering and monitoring the many process splits in the R&D fab becomes very challenging due to the lack of non-destructive 3D measurement technology. Without an innovative in-fab metrology solution, current monitoring must rely on destructive analyses, which are time-consuming and expensive. In this presentation, I will highlight these challenges and explore the need for an innovative, non-destructive 3D in-fab metrology solution.
The emergence of inline screening for high volume manufacturing
Oreste Donzella, John C. Robinson, Kara Sherman, et al.
The semiconductor content of automobiles is growing rapidly in applications where quality is of paramount importance, and automotive manufacturers have taken the lead in driving a "Zero Defect" mentality into their supply chain. The motivation behind this paper started with engagements with semiconductor suppliers as well as automotive manufacturers, where KLA witnessed many clear examples of killer defects passing through test with the potential to enter the automotive supply chain. The current method to drive towards "Zero Defect" levels of chip quality involves two main approaches: process control and electrical test screening. The industry is poised for a new complementary solution that combines the unique detection of physical defects from the production line with the 100% coverage of electrical test, providing high-speed inspection on the most critical reliability layers, covering 100% of lots and 100% of wafers, and looking at each individual die for screening purposes rather than just controlling the process. By inserting a complementary inline inspection screen, Inline Part Average Testing (I-PAT) can help stop maverick wafers and isolate very defective die. Over the last two years KLA has executed proof-of-concept studies for I-PAT using a prototype engineering system. These studies have been conducted at five automotive semiconductor integrated device manufacturers (IDMs) and foundries, covering logic devices with embedded memory, analog devices, and power semiconductor devices in both silicon and silicon carbide, spanning 40 nm to 350 nm design rules. The data sets include information on 1.6 million chips, enabling a meaningful statistical modeling approach. Two case studies are presented that illustrate the effectiveness of inline die screening. Finally, quality- and reliability-critical applications beyond automotive are discussed, including hyperscale data centers and multi-die packaging.
A yield-centric perspective on the growing eBeam role in patterning control
As semiconductor end markets diversified from computers and the Internet to mobile and social media, computing demand increased from hundreds of millions to billions of units. With the advent of AI and big data, end markets further diversified to IoT, communications, self-driving cars, power, and sensors, and demand for computing skyrocketed into the trillions. To fuel this growth, PPACt (power, performance, area-cost, and time-to-market) needed to accelerate. This acceleration is enabled by new architectures, 3D structures, new materials, new ways to shrink, and advanced packaging. One obstacle limiting this growth was identified as a loss of correlation between the new semiconductor device structures mentioned above and the measurements that conventional optical workhorse metrology provided. This was not a one-time calibration event: optical metrology's increasing reliance on accuracy checks, ground truth, and calibration is the result of optical metrology accuracy gaps that grow over time, inflating edge placement error budgets and causing instability in litho-to-etch bias and non-zero offset. A paradigm shift in overlay, CDU, and edge placement error patterning-control metrology was sought to accelerate PPACt at the needed yield. This talk tells the story of how a new standard is introduced to an industry that needs one. We start by addressing the need for a new technology enabling fast, non-destructive, accurate inline metrology; we then validate its accuracy over time using two different industry-standard reference metrologies; and we conclude with testimonies of chipmakers showcasing the use of the new industry standard for patterning control. To enable high-throughput accuracy in overlay, EPE, and NZO metrology, a highly efficient BSE see-through e-beam technology was introduced and benchmarked against optical metrology, showing reduced non-zero offset residuals on device, with the capability to accurately gauge litho-to-etch bias. As good engineering practice, STEM and delayering techniques validated the accuracy of the inline e-beam technology so that it could become the new workhorse and assume the role of HVM overlay metrology. We conclude with three chipmaker case studies of how the new workhorse e-beam metrology accelerated development cycles, reduced reliance on destructive metrology (enabling faster process splits in the fabrication of three-dimensional nanosheet transistors), accelerated the ramp of semiconductor manufacturing, and improved yield through better on-device overlay and NZO in HVM.
Overlay Accuracy I
See-through imaging by HV-SEM: a simulation study
High Voltage SEM (HV-SEM) is becoming more common in typical fab usage for achieving "see-through" overlay and detecting buried defects. To improve understanding of its capabilities and limits, JMONSEL is used to simulate HV-SEM imaging of various buried test structures and to explore how apparent resolution depends on incident beam energy, target and overlayer material, and overlayer depth. This clarifies HV-SEM's limits in terms of what can be seen, at what energy, through how much overlayer, which is important for use-case consideration and optimization.
Review of scanning electron microscope-based overlay measurement
Overlay control has been one of the most critical issues for manufacturing of leading-edge semiconductor devices. Scanning electron microscope-based overlay (SEM-OL) metrology can directly measure both overlay targets and actual devices or device-like structures with high spatial resolution. SEM-OL uses small structures, which allows many SEM-OL targets to be inserted across a die. Precise overlay distributions can be measured using dedicated SEM-OL marks, improving measurement accuracy and repeatability. To extend SEM-OL capability, we have been developing SEM-OL techniques that can measure not only surface patterns by critical-dimension SEM but also buried patterns for leading-edge device processes.
High voltage scanning electron microscope overlay metrology accuracy for after development inspection
S. Czerkas, N. Gutman, R. Gronheid, et al.
We show that an overlay (OVL) metrology system based on a scanning electron microscope can achieve accurate registration of buried and resist (top) structures. The positions were determined from both backscattered electrons (BSE) and secondary electrons (SE). The accuracy was quantified for after-development inspection (ADI) of an advanced EUVL process. Results based on linear tracking showed accuracy below 0.4 nm, robust across process variations and target designs. The influence of various measurement conditions, e.g. field of view, on position and OVL tracking was negligible. The measurement methodology presented is applicable to both standalone high-voltage SEM (HV-SEM) registration targets and optical targets, such as the Advanced Imaging Metrology (AIM®) target used by imaging-based overlay (IBO) metrology systems. Using SEM ADI OVL results as a calibration for optical overlay metrology tools, we demonstrate significant improvements in optical ADI OVL accuracy on small targets such as AIM in-die (AIMid).
Characterization of metrology to device overlay offset and novel methods to minimize it
Kaustuve Bhattacharyya, Ken Chang, Jeff Lin, et al.
An after-etch overlay measurement on device is typically used as the reference overlay, as this is what determines the final overlay. The delta between on-target overlay measured after develop (ADI) and this reference on-device overlay after etch (AEI) is known as the metrology-to-device offset (MTD). As fab overlay is controlled by run-to-run control of ADI overlay, it is preferable to minimize the MTD. The MTD concept in overlay metrology has long been present in the industry, and many ways to mitigate the problem have been adopted (such as designing an ADI overlay target with a similarly low aberration response as the device, or dialing in a static offset between ADI and AEI overlay). As overlay margins continue to shrink, existing methods start to show gaps and are insufficient to suppress the MTD to an acceptable level on the few most critical overlay layers. To address this gap, we need to deploy a much wider solution space that provides an integrated design-lithography-etch solution. To characterize the MTD (assuming the ADI target design has already minimized the aberration-response delta between target and device), it is important to understand that there are two major components to MTD: (1) an inaccuracy in ADI overlay (metrology artifacts, mostly due to the presence of target asymmetry) and (2) an etch-to-litho offset due to effects added after ADI, such as etch-induced expansion and/or stress release. However, these two components are strongly coupled, and traditional characterization methods have difficulty separating their contributions to the measurement. In this technical paper we discuss novel methods (data-driven as well as model-based) to decouple them, and multi-lot results show that MTD can be further minimized compared to a traditional static correction between ADI and AEI.
Machine learning for Tool Induced Shift (TIS) reduction: an HVM case study
Boaz Ophir, Udi Shusterman, Anna Golotsvan, et al.
Tool-induced shift (TIS) is a measurement error metric commonly used to assess the accuracy of metrology tools. TIS manifests as the difference in overlay (OVL) misregistration between measurements of the same target at 0° and 180° wafer rotations. This inaccuracy is attributed to tool asymmetries and is commonly caused by lens aberrations, lens alignment, illumination alignment, and the tool's interaction with target asymmetries. TIS impacts tool total measurement uncertainty (TMU) and tool-to-tool matching. In memory chips, particularly 3D NAND, TMU is limited by the TIS distribution across the wafer, as it depends on process stability and is amplified by high layer topology. Additionally, TIS is influenced by wafer-to-wafer and lot-to-lot process variation. TIS correction by direct measurement per site (TIS-on-Link, ToL) incurs a heavy penalty to measurement throughput, as it requires measuring each site twice. Alternatively, measuring TIS on a sparse subset of sites and interpolating to the other sites (TIS-on-Parent, ToP) induces a lower throughput penalty but is not accurate enough in many cases. In a previous paper we introduced a new methodology to improve overlay measurement with minimum throughput impact: Modeled TIS (mTIS). This approach uses machine learning (ML) algorithms to predict a per-site TIS correction for image-based overlay (IBO) measurements. This method gives near-ToL TIS correction performance at a ToP throughput penalty, or better, depending on the use case. In this paper, we describe some of the algorithmic adaptations we made to the original algorithm to work in a high-volume manufacturing (HVM) environment and present results of an HVM use case on 3D NAND production lots.
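As a concrete illustration of the quantities involved, the sketch below computes per-site TIS from paired 0°/180° measurements and trains a generic regressor to predict it from single-rotation signal features. This is only a schematic stand-in for the proprietary mTIS algorithm; the synthetic data, feature names, and model choice are all assumptions.

```python
# Illustrative sketch (not the authors' mTIS algorithm): per-site TIS from paired
# 0/180-degree measurements, then a generic regressor that predicts TIS from
# single-rotation signal features. All data and feature names are hypothetical.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
n_sites = 500

# Synthetic data: true overlay plus a tool-induced shift that depends on
# per-site signal features (e.g. contrast, asymmetry, focus).
features = rng.normal(size=(n_sites, 3))
true_ovl = rng.normal(scale=2.0, size=n_sites)                       # nm
tis = 0.3 * features[:, 0] - 0.2 * features[:, 1] + rng.normal(scale=0.05, size=n_sites)

m_0deg = true_ovl + tis          # measurement at 0-degree wafer rotation
m_180deg = -true_ovl + tis       # rotation flips the sign of the physical overlay

# Direct (ToL-style) TIS from the paired measurements.
tis_measured = 0.5 * (m_0deg + m_180deg)

# Train on a sparse subset of sites (as with ToP sampling), predict TIS everywhere.
train = rng.choice(n_sites, size=100, replace=False)
model = GradientBoostingRegressor().fit(features[train], tis_measured[train])
tis_predicted = model.predict(features)

corrected_ovl = m_0deg - tis_predicted
print("residual TIS error (3*std):", 3 * np.std(tis_predicted - tis), "nm")
```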
Challenges and New Methods
High volume manufacturing metrology needs at and beyond the 5 nm node
This presentation will examine at a high level the future for in-line high volume manufacturing (HVM) metrology for the semiconductor industry. First, we will take a broad view of the needs of patterned defect, critical dimensional (CD/3D) and films metrology, and present the extensive list of applications for which metrology solutions are needed. Progress will be shown in terms of the IEEE-IRDS roadmap. We will then report on the gating technical limits of the most important of these metrology solutions to address the metrology challenges of future nodes, highlighting key metrology technology gaps requiring industry attention and investment.
Microsphere-assisted ultra-small spot spectral reflectometry technique for semiconductor device metrology
Soonyang Kwon, Kwangrak Kim, Jangryul Park, et al.
We propose a novel spectrum measurement system, microsphere-assisted nanospot spectroscopic reflectometry (MASR), which uses super-resolution imaging beyond the Rayleigh resolution limit in white light. The proof of concept and optimal configuration of MASR are fully verified using both FDTD and ray-optics simulations. We also experimentally validate the usefulness of MASR by obtaining spectra from an extremely small spot of 210 nm, 119X smaller than the 25 μm spot of a conventional spectrum measurement system, thanks to the 530X super-resolution-enhanced magnification provided by an optimally used microsphere, thereby overcoming the physical limit of optical resolution. It is important to note that the proposed technique can measure the spectrum from an extremely tight area, 210 nm in diameter, enabling monitoring of in-cell uniformity and structural changes in narrow, small-area device patterns. Furthermore, our system can be combined with various optical measurement systems as a module that upgrades optical resolution and magnification. To the best of our knowledge, this is the first demonstration of this completely new system concept and method for overcoming the metrology challenges we currently face.
High-speed wafer film measurement with heterogeneous optical sensor system
Doo-Hyun Cho, Seung Beom Park, Sung-Ha Kim, et al.
The inspection of thin-film thickness on a wafer is one of the key steps in semiconductor manufacturing processes. This paper proposes estimating the film thickness profile of the wafer using a 3-band RGB color imaging camera together with a hyperspectral imaging module to achieve robust metrology performance. Simulations are designed to investigate the characteristics of film thickness profiles estimated with Gaussian process regression. We demonstrate that this cost-effective solution is beneficial for monitoring the CMP process with little computational power. The proposed measurement method has great potential to remove bottlenecks in physical metrology processes.
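A minimal sketch of the sensor-fusion idea, under the assumption that the hyperspectral module supplies reference thickness at a sparse set of sites and a Gaussian process regression then maps the fast RGB readings to thickness elsewhere; the synthetic thickness-to-RGB response and all sizes below are illustrative, not the paper's actual data or model.

```python
# Hedged sketch: Gaussian process regression from 3-band RGB readings to film
# thickness, trained on sparser sites with (assumed) hyperspectral reference values.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
n_dense, n_ref = 2000, 150

# Synthetic wafer sites: thickness (nm) plus a smooth, invented RGB response.
thickness = 800 + 50 * rng.standard_normal(n_dense)
def rgb_response(t):
    s = t / 1000.0
    return np.stack([np.sin(2.5 * s), np.cos(1.8 * s), s], axis=-1)

rgb = rgb_response(thickness) + 0.01 * rng.standard_normal((n_dense, 3))

# Hyperspectral reference: a subset of sites with known thickness.
ref_idx = rng.choice(n_dense, size=n_ref, replace=False)

gpr = GaussianProcessRegressor(kernel=RBF(length_scale=0.5) + WhiteKernel(1e-4),
                               normalize_y=True)
gpr.fit(rgb[ref_idx], thickness[ref_idx])

pred, std = gpr.predict(rgb, return_std=True)
print("RMS thickness error (nm):", np.sqrt(np.mean((pred - thickness) ** 2)))
```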
Metrology of thin layer deposition with combined XRR-GIXRF analysis at the SOLEIL synchrotron
Y. Ménesguen, M.-C. Lépy
A new instrument called CASTOR is operated at the SOLEIL synchrotron facility and is dedicated to the characterization of thin films with thicknesses in the nanometer range. The instrument can combine X-ray reflectivity (XRR) measurements with fluorescence (XRF) acquisitions and especially total reflection X-ray fluorescence (TXRF) related techniques such as grazing incidence XRF (GIXRF). The instrument is now routinely installed on the hard X-ray branch of the Metrology beamline and reproducibility is studied as well as reference-free GIXRF analysis. Some representative examples are given to illustrate the capabilities of the setup and of the analysis.
An innovative probe microscopy solution for measuring conductivity profiles in 3-dimensions
The ever-increasing complexity of materials and architectures in nanoelectronic devices has driven the demand for new high-resolution imaging methods. Specifically, for three-dimensional (3D) analysis of confined volumes, atomic force microscopy (AFM) has recently been explored as a method for tomographic sensing. Here, we report on the innovative design of a dedicated microscopy solution for volumetric nanoscale analyses that achieves tomographic AFM by using a novel multi-probe sensing architecture. First, we describe the development of a custom scan head based on exchangeable multi-probe hardware. Second, we demonstrate the use of our machine for tip-induced material removal in thick SiO2. Finally, we perform a tomographic reconstruction of nanosized poly-Si vertical channels, considered here as a prototypical system for vertical memory cells.
Scanning microwave impedance microscopy for materials metrology
Nicholas Antoniou, Ravi Chintala, Yongliang Yang
Metrology of material properties is a growing concern in semiconductor manufacturing, and as a result new and existing technologies are being adapted to address gaps in established capabilities. We introduce here scanning microwave impedance microscopy (sMIM), a technology that can measure critical material properties such as dielectric constant (k-value), capacitance, resistivity, and permittivity at the nanoscale. In this technique, an AFM cantilever is used as a microwave source to measure the electrical properties of materials at the nanometer scale. sMIM is sensitive to local capacitive and conductivity changes in a material, making it an excellent method for a wide range of materials such as insulators, semiconductors, ferroelectrics, and others. An sMIM measurement provides the permittivity and conductivity of films; from this measurement, properties such as dopant concentration and resistivity can be derived and certain defects can be identified. Since microwaves can penetrate deep into the sample, we can measure sub-surface layers as well. The resolution of the sMIM ScanWave system for capacitance measurements has been calculated to be better than 0.3 aF, and the repeatability is well below 1% RSD (1σ), making it suitable for very sensitive process control of dielectric films and dopant concentration. The types of measurements presented here include nano C-V for dielectric film quality, dielectric film k-value, and dopant concentration. Additional applications are envisioned in emerging memory materials.
Inspection
A new metrology technique for defect inspection via coherent Fourier scatterometry using orbital angular momentum beams
Coherent Fourier scatterometry (CFS) with laser beams having a Gaussian spatial profile is routinely used as an in-line inspection tool to detect defects on, for example, lithographic substrates, masks, reticles, and wafers. New metrology techniques that enable high-throughput, high-sensitivity, in-line inspection are critically needed for next-generation high-volume manufacturing, including that based on extreme ultraviolet (EUV) lithography. Here, a set of novel defect inspection techniques is proposed and investigated numerically [Wang et al., Opt. Express 29, 3342 (2021)], based on bright-field CFS using coherent beams that carry orbital angular momentum (OAM). One of our proposed methods, differential OAM CFS, is particularly unique because it does not require a pre-established database for comparison in the case of regularly patterned structures with reflection symmetry, such as 1D and 2D grating structures. We studied the performance of these metrology techniques on both amplitude and phase defects and demonstrated their advantages, showing up to an order of magnitude higher signal-to-noise ratio than conventional Gaussian-beam CFS. These metrology techniques will enable higher sensitivity and robustness for in-line nanoscale defect inspection. More generally, our concept could benefit EUV and X-ray scatterometry as well.
In-line schematic failure analysis technique by defect SEM images
Junya Okude, Chihiro Ida, Kazuhiro Nojima, et al.
To find the root cause of electrically failing chips, we perform failure analysis based on electrical test. However, this analysis takes a long time, because the electrical test is performed months after the defect occurred in the in-line processes. To reduce the analysis time, we used the defects detected by in-line optical inspection after the relevant semiconductor process steps. To identify the position of the defects that caused the failure, we previously matched the CAD contour with the contour extracted from a DR-SEM (defect review SEM) image of the defect. However, the "hit rate" of these defects was not very high; here, the hit rate is the fraction of detected defects that actually cause an electrically failing chip. There were two reasons: first, the matching success rate was low because contour extraction from SEM images is inaccurate; second, the CAD was a mask pattern without circuit node information, so there were over-detections such as shorts between dummy nodes. We propose a high-precision in-line schematic failure analysis technique based on machine learning and circuit node information. For pixel-to-pixel matching, we match a fake SEM image generated by a GAN, instead of the CAD, with the DR-SEM image. Next, we create a CAD layout with the defect added, and the LVS design verification technique generates the corresponding circuit diagram. When the defect's circuit diagram differs from the reference, we classify the defect as one that causes an electrical failure. We confirmed that this technique can dramatically improve the classification accuracy of root-cause defects in manufacturing of our memory device.
e-Beam detection of over-etch in semiconductor processing and how over-etch level is related to defect detection parameters
Kwame Owusu-Boahen, Chang (Carl) Han, Ching Hsueh, et al.
Some over-etch (OE) related defects in semiconductor device processing only become apparent after vias or trenches have been filled. Such defects are usually buried and are often discovered only after failure analysis of failed devices; inline detection by physical means using optical inspection tools is not possible. E-beam inspection has the ability to detect this type of defect electrically. OE-related defects create shorts or leakage paths, and their ability to cause device failure depends on the level or extent of this leakage. A hard OE fail impacts yield, while marginal OE is relatively harmless. E-beam inspection detects both hard OE fails and marginal OE as bright voltage contrast (BVC), and it has always been a challenge to discern yield-impacting hard OE fails from relatively harmless OE based only on the defect images; TEM analysis is often necessary to distinguish between the two. In this paper, an attempt is made to relate the extent of OE to the e-beam defect detection parameters Threshold (TH) and Grey Level (GLV). A correlation between the amount of OE and each of the two parameters is established, and a correlation is also found between the two parameters themselves. With these relationships established, the e-beam defect detection parameters alone can be used to predict OE's potential impact on yield without TEM analysis.
Frequency encoding scheme for reticle front and back side inspection
Michal E. Pawlowski, Corey Loke, Aage Bendiksen, et al.
EUV and DUV reticles are inspected before exposure to guarantee the quality of the lithographic process. The pellicle and the reticle back side are typically inspected for the presence of contaminants that are several micrometers in diameter. While detection of macroscopic particles is fairly straightforward, delineation between particles and stray light arising from the diffractive properties of the reticle pattern is challenging. We present a particle detection system based on structured illumination technology capable of automatically delineating between contaminants and stray light. During measurement, a series of images is acquired by a macro-imaging optical detector (e.g. a camera) positioned normal to the investigated sample. The surface under test is illuminated at an oblique angle by an incoherent polychromatic projection system that projects a pattern whose temporal frequency changes across the inspected surface. The signal acquired by the detector is, in general, an incoherent sum of light scattered by particles and stray light returned by the diffractive pattern. Temporal samples of intensity data are analyzed pixel by pixel in the Fourier domain, and particles are differentiated from stray light by comparing the Fourier spectrum of the acquired signal with the frequency a particle should generate at the analyzed pixel based on the system geometry. Thus, unwanted stray-light contributions are separated from the particle signal by performing logical operations on the signal in the Fourier domain. Experimental results from a technology demonstrator are provided to illustrate performance.
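A conceptual sketch of the per-pixel Fourier test described above; the projected-fringe frequency model, frame count, and image sizes are invented for illustration and are not the demonstrator's actual parameters.

```python
# Conceptual sketch: a particle at a pixel should modulate at the projected-pattern
# frequency expected for that pixel, while stray light from the reticle pattern
# does not. All frequencies and geometry below are hypothetical.
import numpy as np

n_frames, h, w = 64, 32, 32
t = np.arange(n_frames)

# Expected temporal frequency (cycles/frame) varies across the field because the
# projected fringe period changes with position (assumed linear model).
expected_freq = np.broadcast_to(0.10 + 0.15 * (np.arange(w) / w), (h, w))

# Synthetic stack: noise + one "particle" pixel modulating at its expected
# frequency + one "stray light" pixel with an unrelated constant-plus-drift signal.
stack = 0.02 * np.random.default_rng(2).standard_normal((n_frames, h, w))
stack[:, 10, 12] += np.cos(2 * np.pi * expected_freq[10, 12] * t)
stack[:, 20, 25] += 1.0 + 0.01 * t

spectra = np.abs(np.fft.rfft(stack, axis=0))        # per-pixel temporal spectra
freqs = np.fft.rfftfreq(n_frames)

# Score each pixel by the spectral amplitude at its own expected frequency,
# relative to the total non-DC energy.
idx = np.argmin(np.abs(freqs[None, None, :] - expected_freq[..., None]), axis=-1)
at_expected = np.take_along_axis(spectra, idx[None, ...], axis=0)[0]
score = at_expected / (spectra[1:].sum(axis=0) + 1e-12)

print("particle pixel score:", score[10, 12], " stray-light pixel score:", score[20, 25])
```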
Defect simulation in SEM images using generative adversarial networks
Zhe Wang, Liangjiang Yu, Lingling Pu
SEM image processing is an important part of semiconductor manufacturing. However, one difficulty of SEM image processing is collecting enough defect-containing samples of the defects of interest (DOIs), because many DOIs are very rare. This problem becomes more prominent for machine learning (ML) or deep learning (DL) based image processing techniques, since they require large amounts of samples for training. In this paper, we present a generative adversarial network (GAN) based defect simulation framework to tackle this problem. The fundamental insight of our approach is that we treat the defect simulation problem as an image style transfer problem. Following this idea, we train a neural network model to turn a defect-free image into a defect-containing image. We evaluate the proposed defect simulation framework by using it as a data augmentation method for ML/DL-based automatic defect classification (ADC) and image quality enhancement (IQE) on a line pattern dataset, collected with ASML eP™ and eScan® series inspection tools from an ASML standard wafer. The experimental results show a significant performance gain for both ADC and IQE, proving that our defect simulation framework is effective. We expect GAN-based defect simulation to have a broader impact on many other SEM image development and engineering applications in the future.
Electrical validation of massive E-beam defect metrology in EUV-patterned interconnects
Nicola Kissoon, Etienne De Poortere, David Hellin, et al.
EUV resists, while improving steadily, generate a number of nanobridge or break defects that increases quickly as the pitch approaches 30 nm. Inline inspection methods are therefore needed to reliably detect patterning defects smaller than 20 nm. Massive e-beam metrology provides the high resolution needed to measure these defects while remaining compatible with HVM throughput requirements. In this work, we used a direct metal (Ru) etch process to fabricate EUV-patterned electrical structures in the 32 nm to 36 nm pitch range. We demonstrate an almost one-to-one correspondence between the e-beam metrology yield of the structures and their electrical yield. The e-beam inspection is realized with a large-field-of-view HMI eP5 e-beam system. The match between e-beam and electrical yield shows that our e-beam inspection is able to catch all electrically relevant line breaks while excluding false flags. These results demonstrate the capability of massive e-beam inspection in predicting electrical yield.
X-Ray and High Aspect Ratio Metrology
GIXRF and machine learning as metrological tools for shape and element sensitive reconstructions of periodic nanostructures
The characterization of nanostructures and nanostructured surfaces with high sensitivity in the sub-nm range has gained enormously in importance for the development of the next generation of integrated electronic circuits. A reliable and non-destructive characterization of the material composition and dimensional parameters of nanostructures, including their uncertainties, is strongly required. Here, a technique based on grazing-incidence X-ray fluorescence (GIXRF) measurements is proposed. The reconstruction of a lamellar nanoscale grating made of Si3N4 is presented as an example. This technique uses the X-ray standing-wave field, which arises from interference between the incident and reflected radiation, as a nanoscale gauge. This enables the spatial distribution of specific elements to be reconstructed using a finite-element method for the calculation of the standing-wave field inside the material. For this, the optical constants of the constituent materials of the structure are needed; we derived them from soft X-ray reflectivity measurements on an unstructured part of the wafer sample. To counteract the expensive computation of the finite-element Maxwell solver, a Bayesian optimizer is exploited to sample the searched parameter space as efficiently as possible. The method is also used to determine the uncertainties of the reconstructed parameters. The homogeneity of the sample was also analyzed by evaluating several measurement spots across the grating area. For validation of the reconstruction results, the grating line shape was measured by means of atomic force microscopy.
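A minimal Bayesian-optimization sketch of the sampling idea, under stated assumptions: the expensive finite-element GIXRF forward model is replaced by a cheap toy chi-square of two grating parameters, and a Gaussian-process surrogate with an expected-improvement rule chooses where to evaluate next; none of the numbers correspond to the actual reconstruction.

```python
# Hedged sketch: GP-surrogate Bayesian optimization over two grating parameters
# (height, sidewall angle), with a toy chi-square standing in for the FEM solver.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(3)

def forward_chi2(x):                        # stand-in for the FEM-based goodness of fit
    height, swa = x
    return (height - 90.0) ** 2 / 25.0 + (swa - 86.0) ** 2 / 4.0

bounds = np.array([[60.0, 120.0], [80.0, 90.0]])       # nm, degrees (assumed ranges)
X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(5, 2))
y = np.array([forward_chi2(x) for x in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
for _ in range(25):
    gp.fit(X, y)
    cand = rng.uniform(bounds[:, 0], bounds[:, 1], size=(512, 2))
    mu, sigma = gp.predict(cand, return_std=True)
    best = y.min()
    z = (best - mu) / np.maximum(sigma, 1e-9)
    ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)   # expected improvement
    x_next = cand[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, forward_chi2(x_next))

print("best parameters (height nm, swa deg):", X[np.argmin(y)], " chi2:", y.min())
```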
High resolution profiles of 3D NAND pillars using x-ray scattering metrology
High-performance memory production requires discovery of process anomalies that affect device performance. Optical metrology can monitor CDs of device structures, but has limited resolution for 3D NAND, especially towards the bottom of high aspect ratio (HAR) structures. Destructive, offline techniques (e.g., cross-sectional SEM, FIB, TEM) can be used for high resolution HAR profile measurement, but are time consuming, offering little statistical feedback for accurate process control. As an alternative, small angle X-ray scattering (CD-SAXS) can measure 3D HAR structures precisely and non-destructively. CD-SAXS has high sensitivity to 3D profiles, including profiles of CD, ellipticity and tilt through the entirety of the channel hole. CD-SAXS is non-destructive and used inline, providing fast time to results for efficient process characterization. In this paper, we will show data from a CD-SAXS system, including high-resolution CD, ellipticity and tilt profiles, and etch depth of 3D NAND channel holes with strong correlation to reference metrology. With this production-ready CD-SAXS system, flash memory CD profiles can be monitored inline for improved time to an optimized channel etch process.
Comparative near infrared through-focus scanning optical microscopy for 3D memory subsurface defect detection and classification
Through-focus scanning optical microscopy (TSOM) is a model-based optical metrology method that involves scanning a target through the focus of an optical microscope. Nanometer-scale-sensitive information is then extracted by matching the target TSOM data/image to reference TSOM data/images that are either experimentally or computationally collected. The nanometer sensitivity was previously confirmed by several theoretical studies and optical implementations; however, those studies all involved wafer patterns on the top surface. The present study extends the TSOM method to non-destructive subsurface defect detection and classification, which is becoming extremely important with the increasingly wide adoption of 3D semiconductor technologies. First, we apply a near-infrared (NIR) beam as the illumination light in order to allow defect identification over the entire device depth. In addition, we adopt a model-less TSOM approach, since constructing a TSOM reference database for 3D pattern structures such as 3D NAND flash memory is hardly practical. We therefore employ a comparative TSOM method in which a TSOM data cube/image is compared with that of an adjacent die or that of a "golden" die known to be defect-free. We report the results of the first application of this method to an Intel 3D NAND flash device and show that substantial subsurface defects are detected and classified.
Measurability analysis of the HAR structure in 3D memory by T-SAXS simulation
Kaori Sasaki, Takaki Hashimoto, Yenting Kuo, et al.
To control the HAR (high aspect ratio) processes used to produce 3D memories, a non-destructive and highly accurate measurement method is required. We report simulation results of a T-SAXS (transmission small-angle X-ray scattering) measurability analysis that evaluates its capability to measure profile parameters of typical HAR structures. We then discuss the extensibility of T-SAXS to profile measurements of future 3D memories, based on a measurability analysis of various HAR structural models obtained by varying their structural parameters. For the deep regions that are important for future 3D memory shape measurement, we confirmed that a HAR structure with a depth of 30 μm can be measured under the assumed criterion of precision < 1%.
Fast in-device overlay metrology on multi-tier 3DNAND devices without DECAP and its applications in process characterization and control
Yaobin Feng, Pandeng Xuan, Dean Wu, et al.
Multilayer stack height in 3DNAND has reached the limit of the aspect ratio that etch technologies can cost-effectively achieve. The solution to achieve further bit density scaling is to build the stack in two tiers, each etched separately. While lowering the requirements on etch aspect ratio, stacking two tiers introduces a critical overlay at the interface between the stacks. Due to the height of each stack, stress- or etch-induced tilt in the channel holes is translated into overlay. Characterizing and controlling the resulting complex overlay fingerprints requires dense and frequent overlay metrology. The familiar electron beam metrology after etch-back (DECAP) is destructive and therefore too slow and expensive for frequent measurements. This paper will introduce a fast, accurate & robust data-driven method for In Device Overlay Metrology (IDM) on etched 3DNAND devices by making use of specially designed recipe setup targets. Also, potential applications for process control improvement will be demonstrated.
X-ray critical dimension metrology solution for high aspect ratio semiconductor structures
Matthew Wormington, Adam Ginsburg, Israel Reichental, et al.
We have developed a novel in-line solution for the characterization and metrology of high-aspect-ratio (HAR) semiconductor structures using transmission small-angle X-ray scattering (SAXS). The solution consists of the Sirius-XCD® tool, NanoDiffract for XCD (NDX) analysis software, and high-performance computing infrastructure. It provides quantitative information on the orientation and shape of HAR structures, such as 3D NAND channel holes and DRAM capacitors, and can be used for development and control of the critical etch processes used in the formation of such structures. The tool has been designed to minimize expensive cleanroom space without sacrificing performance, with typical measurements taking only a few minutes per site. The analysis is done using real-time regression in parallel with the measurements to maximize the throughput of the solution. We will illustrate the key features of the solution using data from a HAR reference wafer and provide results for hole shape and tilt across the wafer, together with complementary data from other techniques. We will also discuss future opportunities for both stand-alone XCD applications and possible XCD-OCD synergies, including hybrid metrology, in solving complex HAR and other applications.
Evaluation of deep learning model for 3D profiling of HAR features using high-voltage CD-SEM
The aspect ratio of channel holes in 3D NAND memory will continue to increase. High-throughput, in-line monitoring solutions for 3D profiling of high-aspect-ratio (HAR) features are key to yield improvement. A deep learning (DL) model has been developed to improve the 3D profiling accuracy of HAR features. In this work, HAR holes with different bowing geometries were fabricated, and a high-voltage CD-SEM was used to evaluate the performance of the DL model. The accuracy and sensitivity of the DL model were evaluated by comparing the predicted cross-sections with X-SEM measurements. The results show that the DL model enables the maximum CD (MCD) of the bowing features to be predicted with a sensitivity of 0.93 and its depth position to be predicted with a sensitivity of 0.91. The DL model reduced the absolute error of the predicted MCD depth position from several hundred nanometers, the error obtained when using the exponential model, to within 100 nm.
Contour Metrology
Investigating SEM-contour to CD-SEM matching
The control and characterization of very fine semiconductor devices on a wafer are commonly performed by means of a scanning electron microscope (SEM), deriving a critical dimension (CD) from a pair of parallel edges extracted from the images. However, this approach is often not very reliable when dealing with complex 2D patterns. An alternative is to use the SEM contour technique to extract all the edges of the image. This method is more versatile and robust, but before it can be implemented in a manufacturing environment, it must be demonstrated that it matches well with traditional CD-SEM. Aim: The objective of this work is to present a method to evaluate and optimize the CD matching between reference standard CD-SEM measurements and SEM contours. Approach: After describing the metric used to assess the matching performance, we screen some important influential parameters to evaluate the best matching achieved with our experimental data. Results: After optimizing the matching calibration parameters and the selection of the best anchor pattern for the matching, we achieved a 3σ total measurement uncertainty of 0.8 nm for 1D patterns and 3.2 nm for 2D patterns. Conclusions: We established a method to achieve good matching performance that should facilitate the introduction of SEM contours in a manufacturing environment.
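A hedged sketch of one way such a matching assessment can be quantified: contour-derived CDs are calibrated against the reference CD-SEM CDs with an ordinary least-squares fit, and a 3σ figure is taken from the residuals. The paper's exact TMU methodology (e.g. Mandel-type regression with removal of the reference uncertainty) is not reproduced here, and the data below are synthetic.

```python
# Hedged sketch: linear calibration of contour-based CDs to reference CD-SEM CDs,
# then a 3-sigma residual as a simple matching figure of merit. Synthetic data only.
import numpy as np

rng = np.random.default_rng(4)
cd_ref = rng.uniform(20, 60, size=80)                        # reference CD-SEM CDs (nm)
cd_contour = 1.03 * cd_ref - 0.8 + rng.normal(0, 0.3, 80)    # contour-based CDs (nm)

slope, offset = np.polyfit(cd_ref, cd_contour, 1)            # calibration fit
residuals = cd_contour - (slope * cd_ref + offset)
print(f"slope={slope:.3f}, offset={offset:.2f} nm, "
      f"3-sigma matching={3 * residuals.std(ddof=2):.2f} nm")
```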
Pattern placement and shape distortion control using contour-based metrology
Bertrand Le Gratiet, Régis Bouyssou, Julien Ducoté, et al.
Despite the fact that thousands of CD-SEM (critical dimension scanning electron microscope) images are acquired on a daily basis in a fab, only limited metrology is performed on them. Usually these images serve no other purpose after they are collected and measured, but since they are stored, post-process analysis can be applied. Initially, most of these images are used to perform CD metrology, even though many other types of metrics could be extracted from the same images, especially when using contour metrology. In this paper, two use cases are explored in which contour-based image processing is performed on typical inline metrology targets. In both cases the initially intended metric was CD, but thanks to contour-based image computing, complementary information can be extracted. In the first use case, CD and overlay metrics are extracted, while in the second, CD, etch slanting, and asymmetry analysis is performed across the wafer. Contour-based metrology offers new capabilities to dissociate several layers (e.g. via and line) or elements (e.g. top and bottom) in the image, so that interlayer and intralayer metrics other than width dimensions can be computed. In addition, a solution not integrated in the tool provides excellent versatility to re-process images, allowing new metrics to be obtained, which can also be very helpful for retrospective analysis.
Roughness measurement of 2D curvilinear patterns: challenges and advanced methodology
Jonathan Pradelles, Loïc Perraud, Aurélien Fay, et al.
2D curvilinear patterns are more and more present in the lithography landscape. For the related devices, line edge roughness (LER) is a critical figure of merit, as it is for lines and spaces. In this article we propose to use a dedicated edge detection algorithm to measure the LER of 2D curvilinear patterns on CD-SEM images. We present an original method to validate the algorithm in the context of roughness measurement. It is based on the generation of realistic synthetic CD-SEM images with programmed roughness and a precise PSD analysis flow. We show excellent correlation (average R2 = 0.988) between the input roughness parameters and the measured parameters for both 1D and 2D synthetic images. Using synthetic images for different numbers of frames, the sensitivity of the contour extraction to noise is also explored. Finally, the methodology is successfully applied to experimental CD-SEM images for two classes of applications: photonic devices and DSA fingerprint patterns.
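A sketch of the "programmed roughness" idea under stated assumptions: a synthetic edge is generated from a Palasantzas-type target PSD (σ, correlation length ξ, Hurst exponent H) and the PSD measured back from the edge is checked against the input. SEM image formation and contour extraction, which the paper's validation includes, are omitted here.

```python
# Hedged sketch: synthesize an edge with a programmed PSD, then recover its
# roughness from the periodogram. Parameters and normalizations are illustrative.
import numpy as np

rng = np.random.default_rng(5)
n, dx = 4096, 1.0                       # edge points, pixel size (nm)
sigma, xi, H = 1.5, 25.0, 0.75          # programmed roughness parameters

f = np.fft.rfftfreq(n, d=dx)
psd_target = 1.0 / (1.0 + (2 * np.pi * f * xi) ** 2) ** (H + 0.5)
psd_target[0] = 0.0                     # zero-mean edge

# Random-phase synthesis of an edge with the target PSD shape, scaled to sigma.
phases = np.exp(1j * rng.uniform(0, 2 * np.pi, size=f.size))
edge = np.fft.irfft(np.sqrt(psd_target) * phases, n=n)
edge *= sigma / edge.std()

# Measure the PSD back (periodogram) and integrate it to recover sigma.
spec = np.abs(np.fft.rfft(edge)) ** 2 * dx / n
sigma_measured = np.sqrt(2 * np.sum(spec[1:]) / (n * dx))
print("programmed sigma:", sigma, " recovered sigma:", round(float(sigma_measured), 3))
```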
Massive metrology and inspection solution for EUV by area inspection SEM with machine learning technology
Tsuyoshi Kondo, Naoma Ban, Yasushi Ebizuka, et al.
As the development of extreme ultraviolet lithography (EUVL) progresses toward the sub-10 nm generation, the process window becomes very tight. In this situation, local critical dimension (CD) variability, including stochastic defects, directly affects yield loss, and it is very important to inspect and measure all patterning areas of interest on the chip for process verification. In this paper, by combining an area inspection SEM (AI-SEM) with a large field of view (FOV) and die-to-database (D2DB) technologies, we show a comprehensive solution for fast inspection and precise massive CD measurement of EUV-characteristic features, such as after-development inspection (ADI) hole patterns and aperiodic 2D logic patterns. In addition, a big-data analysis of the multiple CD indices output by the AI-SEM and a new process window derived from multivariable analysis are discussed. Furthermore, machine learning (ML) based inspection and metrology to maximize imaging speed is also reported.
Better prediction on patterning failure mode with hotspot aware OPC modeling
A method to perform optical proximity correction (OPC) model calibration that is also sensitive to lithography failure modes and takes advantage of large-field-of-view (LFoV) e-beam inspection is presented. To improve the coverage of the OPC model and the accuracy of after-development inspection (ADI) pattern hotspot prediction, such as trench pinching or bridging in complex 2D routing patterns, a new sampling plan with additional hotspot locations and the corresponding contour input data is introduced. The preliminarily inspected hotspots can be added to the traditional OPC modeling flow to provide extra information for a hotspot-aware OPC model. A compact optical/resist 3D modeling toolkit is applied to interpret the impact of photoresist (PR) profiles and to accurately predict hotspot patterns occurring at the top or bottom of the PR. A contour-based modeling flow is also introduced that uses a site- or edge-based calibration engine to better describe hotspot locations in the hotspot-aware OPC model calibration. To quantify the improvement in pattern coverage in the modeling flow, feature vector (FV) analysis and comparisons between the conventional and the hotspot-aware OPC models are also presented [1]. The time and cost of using conventional critical dimension scanning electron microscope (CD-SEM) metrology to measure such a large number of CD gauges are prohibitive. By contrast, using LFoV e-beam inspection with an improved training algorithm to extract fine contours from wafer hotspots, a hotspot-aware OPC model can predict ADI hotspots with a higher capture rate than a main-feature OPC model. Presumably, a hotspot-aware modeling flow based on LFoV images/contours not only benefits users by improving the capture rate of lithography defects, but also brings advantages to failure mode analysis at the post-etch stage.
Contour-based process characterization, modeling, and control for semiconductor manufacturing
In this paper, we present the flow and results of contour-based process characterization, modeling, and control used for semiconductor manufacturing. First, high-quality contours are extracted from large-field-of-view (FOV) SEM images based on an improved Canny edge detection algorithm. Prior to the contour analysis steps, SEM image distortion correction is performed using a low-order linear model. When there are repeating cells within one FOV, the N-sigma roughness band of the unit cell is calculated to show the stochastic process variation fingerprint. For SEM images collected from a focus-exposure matrix wafer, contour-based process window analysis is performed to generate a depth-of-focus map for the full image, enabling precise detection of process-window-limiting locations. Finally, 3D compact resist models are calibrated using both inner and outer contours from the same SEM images, which proves effective for predicting resist-top-loss-related hotspots.
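A minimal sketch of the contour-extraction and N-sigma band idea on a synthetic line image; a simple per-row gradient-peak detector with sub-pixel refinement stands in for the improved Canny-based extraction described above, and the image, noise level, and sizes are illustrative.

```python
# Hedged sketch: extract a per-row edge position from a synthetic noisy line image
# and report an N-sigma roughness band. A gradient-peak detector stands in for the
# paper's Canny-based contour extraction.
import numpy as np
from scipy.ndimage import gaussian_filter1d

rng = np.random.default_rng(6)
h, w = 256, 128
wander = np.cumsum(rng.normal(0, 0.2, size=h))          # slowly wandering edge (px)
wander -= wander.mean()
cols = np.arange(w)
image = (cols[None, :] > (60 + wander)[:, None]).astype(float)
image += 0.1 * rng.standard_normal((h, w))              # SEM-like noise

# Per-row edge position: peak of the smoothed horizontal gradient, refined by a
# parabolic (sub-pixel) fit around the peak.
smooth = gaussian_filter1d(image, sigma=2.0, axis=1)
grad = np.gradient(smooth, axis=1)
peak = grad.argmax(axis=1)
left, mid, right = (grad[np.arange(h), peak + d] for d in (-1, 0, 1))
subpix = peak + 0.5 * (left - right) / (left - 2 * mid + right)

n_sigma = 3
print(f"mean edge position: {subpix.mean():.2f} px, "
      f"{n_sigma}-sigma roughness band: {n_sigma * subpix.std():.2f} px")
```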
Contour metrology accuracy assessment using TMU analysis
In semiconductor fabs, top-down critical dimension scanning electron microscopes (CD-SEMs) are key enablers for metrology and process control, and in order to continue addressing the increasing metrology requirements of the semiconductor industry, additional applications that exploit the potential of CD-SEM equipment have been developed. More recently, the uses of contour extraction have expanded to encompass other applications that serve newer technologies, such as ILT (inverse lithography technology), photonics, and DSA (directed self-assembly), where the characteristics of the 2D curvilinear patterns are such that standard measurement solutions are no longer an option. The applications for these newer technologies have driven the need for highly accurate contour metrology algorithms and, in turn, the need for standardization of what is meant by "accurate" contour metrology. This paper explores the concept of accuracy as it relates to contour metrology and applies the fundamental methodologies of TMU analysis to the problem of finding the best methods for determining contour metrology accuracy. During the development of this methodology, it was discovered that applying TMU analysis to contour metrology reveals unexpected conceptual challenges that force a deeper understanding of accuracy and how it pertains to contour metrology. The paper resolves these challenges and proposes a standardization for contour metrology accuracy assessment. Results from analysis of the extracted contours validate the usefulness of the proposed accuracy methodology and establish the quality of the accuracy of the extracted contours.
Roughness Metrology
SEM image denoising with unsupervised machine learning for better defect inspection and metrology
CD-SEM images inherently contain a significant level of noise, because a limited number of frames is used for averaging, which is critical to ensure throughput and minimize resist shrinkage. This noise may lead to false defect detections and erroneous metrology; reducing noise in SEM images is therefore of utmost importance. Both conventional noise-filtering techniques and the most recent discriminative deep-learning-based denoising algorithms have certain limitations: the former risk loss of information content, and the latter mostly require clean ground-truth or synthetic images to train with. In this paper, we propose a U-Net-based unsupervised machine learning approach for denoising CD-SEM images without requiring any such ground-truth or synthetic images. We have also analyzed and validated our results using the MetroLER library, v2.2.5.0. We compared the power spectral density (PSD) of the original noisy and denoised images: the high-frequency component related to noise is clearly affected, as expected, while the low-frequency component, related to the actual morphology of the feature, is unaltered. This indicates that the information content of the denoised images was not degraded by the proposed denoising approach, in contrast to other existing approaches.
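A sketch of the PSD-based validation step only (the U-Net itself is not shown): a simple Gaussian-filtered image stands in for the network output, purely to illustrate how a row-averaged PSD separates the high-frequency noise band from the low-frequency morphology that must be preserved; the synthetic image and frequency bands are assumptions.

```python
# Hedged sketch: compare row-averaged PSDs of a noisy image and a "denoised"
# stand-in (Gaussian filter) in low- and high-frequency bands.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(7)
h, w = 256, 512
x = np.arange(w)
clean = np.sin(2 * np.pi * x / 64.0)[None, :] * np.ones((h, 1))   # low-frequency "morphology"
noisy = clean + 0.8 * rng.standard_normal((h, w))
denoised = gaussian_filter(noisy, sigma=(0, 2.0))                  # stand-in for the network

def row_psd(img):
    return (np.abs(np.fft.rfft(img, axis=1)) ** 2).mean(axis=0)

f = np.fft.rfftfreq(w)
for name, img in [("noisy", noisy), ("denoised", denoised)]:
    psd = row_psd(img)
    low = psd[(f > 0) & (f < 0.05)].mean()     # morphology band should be preserved
    high = psd[f > 0.25].mean()                # noise band should drop
    print(f"{name:9s} low-f power {low:10.1f}   high-f power {high:8.3f}")
```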
Spectral analysis of line edge and line width roughness using wavelets
Although line edge and line width roughness (LER/LWR) have been key metrology challenges over the last 15 years, the advent of extreme-ultraviolet (EUV) lithography has increased the importance of their measurement and control. Lithographically printed features are now small enough that randomness in resist chemistry and in the EUV photon distribution during exposure results in noise in the patterned resist. This not only affects LER/LWR, but also defect density, including missing holes, shifted features, bridged lines and holes, and line shorts, among others. Techniques to analyze roughness existed well before these stochastics-induced roughness variations. These include power spectral density algorithms and methods to account for instrument bias in the data, identify and filter noise, and specify measurement uncertainty. In this work, analysis methods to evaluate LER and LWR spatial wavelengths, including partitioning and filtering out instrument errors such as noise and probe effects, are presented. Our approach is based on wavelet-transform multiresolution analysis. One of the key advantages of the wavelet transform over other signal processing techniques is its spatial-frequency localization and multi-scale view of the components of a profile or surface. This allows decomposing the data into different bands based on specific cutoffs and evaluating the approximations and surface details at each cutoff band. A priori noise and probe information is used to determine and remove instrument effects from the data before calculating the unbiased roughness. The strength of this approach is that it is targeted only at specific spatial wavelengths that are associated with instrument noise or artifacts.
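The sketch below illustrates the general idea of a wavelet multiresolution decomposition of a line-edge profile using PyWavelets; the wavelet family, decomposition depth, and the simple "zero the finest band" noise handling are assumptions for illustration, not the authors' exact recipe.

# Illustrative wavelet decomposition of a synthetic line-edge profile.
import numpy as np
import pywt

rng = np.random.default_rng(1)
n = 1024
x = np.arange(n)
edge = 0.8 * np.sin(2 * np.pi * x / 200) + 0.3 * np.sin(2 * np.pi * x / 25)
edge += rng.normal(0, 0.2, n)                      # instrument-like white noise

coeffs = pywt.wavedec(edge, 'db4', level=5)        # [cA5, cD5, cD4, ..., cD1]
for name, c in zip(['A5', 'D5', 'D4', 'D3', 'D2', 'D1'], coeffs):
    # Approximate per-band contribution to the profile variance (roughness by scale).
    print(name, np.var(c) * len(c) / n)

# Crude "filter out instrument noise" step: drop the finest-detail band,
# which is dominated by white SEM noise, then reconstruct the profile.
coeffs[-1] = np.zeros_like(coeffs[-1])
edge_filtered = pywt.waverec(coeffs, 'db4')[:n]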
Evaluating SEM-based LER metrology using a metrological tilting-AFM
Ryosuke Kizu, Ichiko Misumi, Akiko Hirai, et al.
In this study, we developed a methodology to evaluate scanning electron microscopy (SEM)-based line edge roughness (LER) metrology. In particular, we used a metrological tilting atomic force microscope (tilting-mAFM) as the LER reference metrology. We analyzed the height-height correlation function (HHCF) of SEM line-edge profiles, combining averaging and unbiased-correction methods. The direct comparison of our method with the tilting-mAFM enabled a precise evaluation of the SEM-based LER metrology. We demonstrated that a combination of unbiased HHCF and averaging methods with appropriate conditions enables relatively precise measurement of three roughness parameters. We observed that, for precise roughness evaluation, reducing noise in the line-edge profiles is important before performing the HHCF analysis and unbiased correction.
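For readers who want a concrete picture of an HHCF analysis, the sketch below computes HHCF(r) = <(z(x+r) - z(x))^2> for a synthetic edge and fits the standard self-affine form 2*sigma^2*(1 - exp(-(r/xi)^(2H))) to recover sigma, the correlation length xi, and the Hurst exponent H. The averaging and noise-unbiasing steps used in the paper are not included; the edge model and parameter values are illustrative.

# Sketch: HHCF of a synthetic, exponentially correlated edge (H = 0.5 case).
import numpy as np
from scipy.optimize import curve_fit

def hhcf(edge, max_lag=None):
    n = len(edge)
    max_lag = max_lag or n // 4
    lags = np.arange(1, max_lag)
    g = np.array([np.mean((edge[r:] - edge[:-r]) ** 2) for r in lags])
    return lags, g

def self_affine(r, sigma, xi, H):
    return 2 * sigma**2 * (1 - np.exp(-(r / xi) ** (2 * H)))

rng = np.random.default_rng(2)
n, xi_true, sigma_true = 4000, 40.0, 1.5
a = np.exp(-1.0 / xi_true)
noise = rng.normal(0, sigma_true * np.sqrt(1 - a**2), n)
edge = np.zeros(n)
for i in range(1, n):
    edge[i] = a * edge[i - 1] + noise[i]           # AR(1) edge: exponential correlation

lags, g = hhcf(edge)
(sigma, xi, H), _ = curve_fit(self_affine, lags, g, p0=[1.0, 20.0, 0.5],
                              bounds=([1e-3, 1e-3, 0.01], [10, 1000, 1.0]))
print(f"sigma={sigma:.2f}, xi={xi:.1f}, H={H:.2f}")   # expect roughly 1.5, 40, 0.5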
Determining the validity domain of roughness measurements as a function of CD-SEM acquisition conditions
In the effort to continue improving patterning strategies for increasing circuit density while reducing dimensions, several challenges regarding patterning fidelity emerge. In recent years, the relative importance of stochastic effects has increased, and therefore the need for closely monitoring those effects is also increasing [1]. Among other stochastic effects, within-feature roughness is significant as it can impact circuit electrical behavior, degrading timing and power performance, and even lead to failures. The workhorse method of the industry for measuring roughness is based on top-down CD-SEM (Critical Dimension Scanning Electron Microscope) images. In recent years, methods have been proposed to improve and standardize the roughness measurement [2, 3]. Those methods rely on obtaining the power spectral density (PSD) from the detected edges of the features in SEM images in order to determine their roughness. However, one important aspect is the impact of the CD-SEM image acquisition conditions on the limitation of the observed PSD. As the acquisition parameters change, different frequencies may be more or less observable in a SEM image, potentially leading to errors in the metrology evaluation [4]. The goal of this study is, first, to present the impact of the CD-SEM image acquisition conditions on the roughness measurement and, second, to propose a method to determine the validity domain of the roughness measurements as a function of the acquisition conditions. The proposed method relies on a compact SEM image model. For each acquisition condition, the model is calibrated based on experimental SEM images from several design samples. Using this calibrated model, synthetic SEM images are generated with a known sample, including its programmed roughness signature (input-PSD), which can be white noise (defined by a constant PSD). The next step relies on a robust-to-noise edge detection algorithm [5], which is then used to compute the PSD by applying the method proposed in [4]. As the input-PSD is known, it is possible to compute the transfer function of the acquisition system [6] for each of the evaluated acquisition conditions. We call ‘limit-PSD’ this transfer function, which may be considered as the signature of the acquisition conditions in the frequency domain. It can be seen as a low-pass filter and it defines the validity domain of the roughness measurements. For each input-PSD (simulated or experimental), if it is ‘below’ the limit-PSD (within the low-pass filter), the measurement is within the validity domain. If the resulting measurement reaches the limit-PSD, it is not possible to know whether some roughness information was lost due to the acquisition conditions. This relationship is illustrated in Figure 1, presenting one case where the input-PSD is inside the validity domain and a second case where the actual roughness is underestimated. Thanks to the produced results, the information on different acquisition conditions and detected roughness can be obtained and stored in what we call condition tables. For each acquisition condition, the limits of the frequency range and of the roughness parameter range (ξ, H) of the measurable roughness are stored (Figure 2). These condition tables are very useful for assisting metrology specialists in choosing the most suitable acquisition conditions for the CD-SEM, if prior knowledge about the expected roughness is available, or in accounting for the validity domain of the roughness measurement.
As a final step, the proposed method is applied to experimental data – an example of observed PSD over an experimental SEM image is shown in Figure 3. The experimental dataset is composed of SEM images acquired under different conditions (notably a large variation in landing energy – 300eV to 4000eV), and different sample materials (post-etch and post-litho samples). The obtained results confirmed the applicability of the proposed method in a real environment, and will be fully demonstrated in the final work.
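A minimal sketch of the transfer-function ("limit-PSD") idea follows: push a known flat (white) input PSD through a simulated acquisition chain, then take the ratio of observed to input PSD. The "acquisition chain" here is stood in for by a Gaussian blur plus additive noise on a 1D edge profile; the paper instead uses a full calibrated SEM image model and its robust edge detection, so every parameter below is an assumption.

# Sketch: limit-PSD as the ratio of observed to known input PSD.
import numpy as np
from scipy.ndimage import gaussian_filter1d

rng = np.random.default_rng(3)
n, dx = 4096, 1.0                                   # profile length, nm per point
edge_in = rng.normal(0, 1.0, n)                     # white-noise roughness (flat input PSD)
edge_out = gaussian_filter1d(edge_in, sigma=3.0)    # stand-in for beam/imaging blur
edge_out += rng.normal(0, 0.05, n)                  # residual detection noise

def psd(profile, dx=1.0):
    f = np.fft.rfftfreq(len(profile), dx)
    p = np.abs(np.fft.rfft(profile - profile.mean())) ** 2 * dx / len(profile)
    return f, p

f, p_in = psd(edge_in, dx)
_, p_out = psd(edge_out, dx)
limit_psd = p_out[1:] / p_in[1:]                    # acquisition transfer function (DC excluded)
# Frequencies where limit_psd has collapsed toward the noise floor lie outside
# the validity domain of the roughness measurement.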
Denoising sample-limited SEM images without clean data
Hairong Lei, Cho Teh, Liangjiang Yu, et al.
Over the past few years, noise2noise, noise2void, noise2self, and other unsupervised deep-learning (DL) denoising techniques have achieved great success, particularly in scenarios where ground truth data is not available or is difficult to obtain. For semiconductor SEM images, ground truth or clean target images with lower noise levels can be obtained by averaging hundreds of frames at the same wafer location, but this is expensive and can result in physical damage to the wafer. This paper’s scope is to denoise SEM images without clean target images and with limited image counts. Inspired by noise2noise, we propose an additive noise algorithm and a DL U-Net. We achieved good denoising performance using a limited number of noisy SEM images, without clean ground truth images. We propose “denoise2next” and “denoise2best”. We compare generative adversarial network (GAN)-generated images and additive-noise images for data augmentation. This paper further quantifies the impact of image noise level, pattern diversity, and continuous (aka transfer) learning. The data sets used in this work include both line/space and logic patterns.
Diagnosing and removing CD-SEM metrology artifacts
Chris A. Mack, Gian F. Lorusso, Christie Delvaux
Background: Random and systematic errors found in CD-SEM tools affect the measurement of roughness in dramatically different ways than the measurement of the average critical dimension. Aim: In order to increase the accuracy of roughness measurements, monitor the health of CD-SEM tools, and improve CD-SEM tool matching, it is important to measure and remove the impact of random and systematic errors from the measurements. Approach: Several different CD-SEM tool systematic errors have been identified, but the scan error signature in particular was found to be very relevant. This signature is measured using the mean contour of many properly sampled features and can be used as the target edge for roughness calculations in order to remove this error. Results: Using six different CD-SEM tools and a large data set of across-wafer, across-scanner-field measurements of the same wafer, each CD-SEM tool was found to have a unique CD-SEM signature. Subtracting off this error signature significantly improved the accuracy of the roughness measurements and the CD-SEM tool-to-tool matching. It also identified one tool as being problematic, requiring further attention. Conclusions: Measurement and characterization of the CD-SEM scan error is a powerful tool, along with measurement and removal of random edge detection noise, for monitoring CD-SEM tool health, matching different CD-SEM tools, and improving the accuracy of line-edge and linewidth roughness measurements.
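The sketch below illustrates the scan-error-signature idea described above: average the detected edges of many nominally identical, properly sampled features to estimate the systematic per-scan-line error, and use that mean contour (rather than a straight line) as the reference edge for roughness. The data layout and alignment assumptions are illustrative; they are not the authors' exact procedure.

# Sketch: removing a systematic scan-error signature from edge roughness.
import numpy as np

def roughness_with_signature_removal(edges):
    """edges: array of shape (n_features, n_scanlines) of detected edge
    positions, with all features aligned to the same scan-line grid."""
    edges = np.asarray(edges, dtype=float)
    edges = edges - edges.mean(axis=1, keepdims=True)   # remove per-feature offset
    signature = edges.mean(axis=0)                      # estimate of systematic scan error
    residual = edges - signature                        # random roughness only
    sigma_raw = edges.std(axis=1, ddof=1).mean()
    sigma_corrected = residual.std(axis=1, ddof=1).mean()
    return signature, sigma_raw, sigma_corrected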
Pixelization effect in SEM images: investigating the effect of the selected pixel size on LER measurement
In this paper, we focus on the across-edge pixelization effects on LER measurement accuracy. The pixelization across edges rounds the detected edge position according to pixel size and is expected to impact the accuracy of LER measurements (rms, correlation length, PSD curve). Given the similar size of the targeted LER values in the IRDS and of the pixel size used in SEM measurements, these effects are expected to gain increasing interest and are therefore worth studying. Our investigation relies on synthesized images (characterized by complete control of roughness and image parameters), with the final aim of exploring the link between the selected pixel size of the SEM image and the observed deviation of the measured rms (rms_m) from the true rms value (rms_t). More specifically, the goal is to find the role of the ratio of pixel size to rms_t in the measurement of rms and how we can predict it.
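A toy version of the pixelization experiment is sketched below: quantize a known rough edge to a pixel grid and compare the measured rms (rms_m) against the true rms (rms_t) as a function of pixel size / rms_t. The edge model is a simple correlated Gaussian, not the full synthesized-image flow of the paper.

# Sketch: effect of pixel quantization on measured edge rms.
import numpy as np
from scipy.ndimage import gaussian_filter1d

rng = np.random.default_rng(4)
edge = gaussian_filter1d(rng.normal(0, 1, 20000), sigma=10)
edge *= 1.0 / edge.std()                                # normalize so rms_t = 1.0

for pixel in [0.25, 0.5, 1.0, 2.0]:                     # pixel size in units of rms_t
    edge_q = np.round(edge / pixel) * pixel             # edge position snapped to pixel grid
    print(f"pixel/rms_t={pixel:4.2f}  rms_m={edge_q.std():.3f}")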
Process Control
Identifying contributors to overlay variability using a model-less analysis method
Overlay (OVL) has become a critical process control and metrology challenge for current and future process nodes of logic as well as memory devices. Especially with the advent of EUV lithography and the accompanying use of two lithographic techniques (EUV and 193nm immersion) for patterning of critical layers, there is an increased need for identifying variability and its root cause in the overlay signatures. Current variability analysis uses pre-defined models to describe variability and allocate it to standard categories. These models are usually tied to the applicable exposure capabilities. As the EUV and immersion layers undergo exposure under vastly different conditions, there is a need to analyze OVL without relying on specific models. In this paper, we report on a novel model-less method for analyzing overlay data containing complex intra-field signatures. The method can identify and quantify intra-field signature variation within a wafer as well as across wafers. These signatures enable root cause analysis of contributors to overlay variability. We applied the method on long-term overlay data sets of an EUV to 193-immersion layer pair. While several applications of the method with respect to identifying exposure conditions are demonstrated specifically for the EUV to immersion layer pair, it should be noted that the method is universally applicable to any imaging wavelength for current and reference layers.
Optimizing focus and dose process windows for robust process control using a multi-feature analysis
Advancing technology nodes in DRAM continue to drive the reduction of the on-product overlay (OV) budget. This gives rise to the need for OV metrology with greater accuracy. However, the ever-increasing process complexity brings additional challenges related to metrology target deformation, which can contribute to metrology error. Typically, an accurate OV measurement involves several engineering cycles for target and recipe optimization. In particular, process optimization in either the technology development (TD) phase or the high volume manufacturing (HVM) phase might influence metrology performance, which requires re-optimization. Therefore, a comprehensive solution providing accuracy and process robustness, thereby minimizing cycle time, is highly desirable. In this work, we report multi-wavelength μDBO enhanced with accuracy-aware pixel selection as a solution for OV measurement that is robust against process changes and offers improved accuracy in HVM. Accuracy-aware pixel selection is capable of tackling intra-target processing variations and is built on a multi-wavelength algorithm with immunity to target asymmetry impact. DRAM use cases in FEOL critical layers will be discussed in this paper. Superior robustness and accuracy will be demonstrated together with improved on-product OV performance, promising a process-of-record metrology solution for specific applications throughout the TD and HVM phases.
Privacy preserving amalgamated machine learning for process control
Wilfried Verachtert, Thomas J. Ashby, Imen Chakroun, et al.
Further application of machine learning is important for the future development of semiconductor fabrication. Machine learning relies on access to large, detailed datasets. When different parts of the data are owned by different companies that do not wish to pool their data due to commercial sensitivity concerns, the benefits of machine learning can be limited, resulting in reduced manufacturing performance. Imec has developed Privacy-preserving Amalgamated Machine Learning (PAML) to overcome this problem and achieve predictive performance close to models built on pooled data, without compromising sensitive raw data. In this paper we give a concrete example based on an in-house overlay metrology dataset where we apply a PAML-enhanced version of a tree regression model and quantify the performance benefit compared to separate models that do not have access to all of the data.
Hybrid overlay control solution with CDSEM and optical metrology
With continued innovation of semiconductor processes, overlay control has become the most critical and challenging part. Advanced technology nodes require even tighter lithography overlay control, and therefore high-order process corrections for inter-field (HOPC) and intra-field (iHOPC) are adopted as a common solution to meet on-product overlay (OPO) specifications. High-order corrections often require more measurement shots and more targets in the field, which makes optical overlay metrology on scribe-line targets the workhorse of overlay control due to its high throughput and low cost-of-ownership. This leads to the additional challenge that the measurement location also affects the accuracy of the generated overlay corrections. For example, it is well known that there may be a spatially dependent offset between overlay on targets and on the device. This is commonly called a non-zero offset (NZO) [1], which is a comparison between device overlay measured with the CD-SEM after etch (AEI) and optical overlay measured on targets after litho (ADI). In addition, the position of targets can impact the validity of corrections modeled using these targets. The targets may be unevenly distributed in the field: several targets can cluster in one area while not a single target appears in others. Hence, this kind of target layout risks generating problematic field corrections in areas without enough targets. In this paper, we propose a hybrid method utilizing CDSEM overlay to fill in the positions where optical overlay targets are deficient. With iHOPC model terms generated by optical overlay targets only, CDSEM metrology results from real devices reveal significantly larger overlay in areas with no targets. By means of this method, the mis-correction at locations where optical overlay targets are deficient is significantly reduced, and consequently the OPO mean+3sigma is suppressed to <4nm. Furthermore, an inline control solution is proposed and implemented with the latest generation 5D Analyzer.
Novel methods for stitching and overlay corrections
The stitching process is a widely adopted technique in the manufacturing of image sensors to overcome reticle size limitations. In order to accomplish successful stitching, both standard overlay target data and stitching data from stitching marks need to be monitored and controlled. Large overlay errors will result in faulty electrical connections between layers, and therefore in chip failure. Similarly, large stitching errors can cause poor contact between neighboring sub-chips, and consequently result in device malfunction. In this article, we propose three novel methods to enable the correction per exposure (CPE) model for stitching and overlay control. With the implementation of these methods, the stitching and overlay residuals are significantly improved compared with current solutions.
Scatterometry
A breakthrough on throughput and accuracy limitation in ellipsometry using self-interference holographic analysis
An innovative self-interferometric pupil ellipsometry (SIPE) technique has been demonstrated to overcome the accuracy and throughput limitations of conventional spectroscopic ellipsometry (SE) tools for precisely measuring optical critical dimensions (OCD) in advanced semiconductor devices. The proposed SIPE technique is extremely powerful because the key ellipsometric parameters, Ψ and Δ, from all possible incident angles can be obtained simultaneously from a single measurement, whereas the conventional SE technique needs to collect several hundred measurements to obtain the same information. By employing a Nomarski prism, the light reflected from the wafer is angularly separated into two orthogonally polarized beams. The self-interference pupil ellipsometer then interferes these two beams without an additional reference beam path. The interference fringe contains rich ellipsometric information at incident angles from -70º to 70º over 0-360º azimuthal directions, from which the Ψ and Δ information can be extracted by the novel holographic algorithm we propose. To verify the usefulness of the SIPE system and the algorithms, both experimental and theoretical validation have been performed on patterned wafers. In short, the proposed system and algorithms, which are a completely new concept, show a capability to overcome current metrology challenges by breaking correlations between various structural parameters, eventually resulting in improved metrology sensitivity and precision. Based on the results presented here, we strongly believe SIPE is a promising metrology solution that can eventually replace traditional OCD tools.
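For completeness, the standard ellipsometric definitions assumed throughout this kind of work (not specific to SIPE) relate Ψ and Δ to the complex ratio of the p- and s-polarized amplitude reflection coefficients at each angle of incidence:

\[
  \rho \;=\; \frac{r_p}{r_s} \;=\; \tan\Psi\, e^{i\Delta}
\]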
Improving data-driven OCD uncertainties from Gaussian process regression
The continued extensibility of optical critical dimension (OCD) metrology relies not only upon experimental advances but also upon improved, rigorous data analysis. While often approached using traditional non-linear regression (NLR), the inverse problem inherent to OCD metrology can also be addressed using machine learning (ML) techniques. In our recent comparison between two ML approaches, Gaussian process regression (GPR) and NLR enhanced with radial basis function (RBF) interpolation, these methods could yield similar parametric values for a sufficient number of training points, but GPR yielded much larger uncertainties. Here, we refine and further explore GPR through the addition and explanation of additional parameters, often better capturing crucial physical considerations. By identifying these key fitting attributes, reductions in both parametric uncertainties and parametric bias are realized. The industrial applicability of GPR and similar ML approaches is discussed with respect to computational cost.
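A hedged sketch of GPR applied to an OCD-style inverse problem with scikit-learn follows: spectra (random placeholders here) map to a parameter such as a CD, and the predictive standard deviation provides the data-driven uncertainty discussed above. The kernel choice and hyperparameters are illustrative, not those used in the paper.

# Sketch: Gaussian process regression with predictive uncertainty.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel, ConstantKernel

rng = np.random.default_rng(5)
X_train = rng.normal(size=(200, 16))                              # stand-in simulated spectra
y_train = X_train[:, :3].sum(axis=1) + rng.normal(0, 0.05, 200)   # stand-in CD values

kernel = ConstantKernel(1.0) * RBF(length_scale=np.ones(16)) + WhiteKernel(1e-2)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_train, y_train)

X_new = rng.normal(size=(5, 16))
cd_mean, cd_std = gpr.predict(X_new, return_std=True)   # prediction and its uncertainty
print(np.round(cd_mean, 3), np.round(cd_std, 3))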
Machine learning aided process control: critical dimension uniformity control of etching process in 1z nm DRAM
Conventional semiconductor etch process control has been performed in separate steps: process, metrology, and feedback control. The uniformity of structures, such as Critical Dimension (CD), is an important factor in determining the completeness of the etch process. To achieve better uniformity, several feedback control schemes have been applied. However, it is difficult to feed back to the process after metrology due to a lack of process knowledge. In this study, we propose a machine learning technique that can create process control commands from the measured structure using a miniaturized Integrated Metrology (IM) module in spectroscopic ellipsometry (SE) form. The physical analysis can be learned through machine learning without introducing an explicit physical analysis method. The proposed analysis consists of two machine learning parts: a first neural network for CD metrology and a second network for command generation. The first neural network takes a spectrum sampled at 2048 wavelengths obtained from the IM as an input and outputs the CDs of the structures. The second artificial neural network takes the changes of temperatures across a wafer and outputs the power control commands. As a result, we have improved the within-wafer CD range of the poly mask from 1.69 nm to 1.36 nm.
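The two-stage setup described above can be pictured with the sketch below, which uses scikit-learn MLPs as stand-ins: network 1 maps a 2048-channel spectrum to CDs, network 2 maps wafer temperature changes to power commands. All data, layer sizes, and zone counts are random placeholders; the authors' actual networks and training data are not reproduced.

# Sketch: two-stage learned flow (spectrum -> CD, delta-T -> power command).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(6)
spectra = rng.normal(size=(500, 2048))              # IM spectra (2048 wavelengths)
cds = rng.normal(50, 1.5, size=(500, 4))            # CDs of 4 structures (nm), placeholder
net_cd = MLPRegressor(hidden_layer_sizes=(256, 64), max_iter=200).fit(spectra, cds)

delta_T = rng.normal(size=(500, 8))                 # temperature changes at 8 zones (assumed)
powers = rng.normal(size=(500, 8))                  # corresponding power commands, placeholder
net_cmd = MLPRegressor(hidden_layer_sizes=(32,), max_iter=200).fit(delta_T, powers)

cd_pred = net_cd.predict(spectra[:1])               # step 1: metrology from a spectrum
cmd = net_cmd.predict(delta_T[:1])                  # step 2: control command generation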
Latent image characterization by spectroscopic reflectometry in the extreme ultraviolet
Sophia Schröder, Lukas Bahrenberg, Bernhard Lüttgenau, et al.
The authors present latent image characterization in photoresists by means of extreme ultraviolet (EUV) spectroscopic reflectometry. The optical constants of photoresists before and after exposure are measured in the EUV spectral range. Latent images are investigated in the form of periodic line gratings. The investigation is performed by the analysis of spectroscopic reflectance curves in the wavelength range from 5 nm to 20 nm at grazing incidence angles. Through an analysis of the reflectance curves based on rigorous electromagnetic modeling, parameters of interest of the latent image are evaluated. These include the latent image profile, surface topography, and stochastics-related parameters such as line edge roughness.
Unsupervised density-based machine learning for abnormal leveling signatures detection
The semiconductor industry relies on metrology to keep up with a highly competitive production environment and technology ramp-up. To reduce metrology costs without degrading quality, we propose to use sensor data, such as scanner leveling data, as a new way to detect maverick lots and wafers, enabling a smarter measurement sampling scheme. To achieve this, data preparation and data cleaning with the Zernike polynomial method are required. The pre-processed data are then used to feed an unsupervised density-based machine learning algorithm (DBSCAN) that can detect outliers as a human expert would. Finally, a solution (Random Forest Discriminant Analysis) for root cause detection of abnormal fingerprints is tested in this paper. A method that works on other use cases (Partial Least Squares Discriminant Analysis) is also used for cross-checking the results.
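The outlier-detection step can be sketched as below: represent each wafer by Zernike-like polynomial coefficients (random placeholders here for the cleaned leveling data) and flag wafers that DBSCAN labels as noise (label -1). The eps, min_samples, and coefficient-count values are illustrative assumptions, not the tuned values of the paper.

# Sketch: DBSCAN-based flagging of abnormal leveling signatures.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(7)
normal = rng.normal(0, 1, size=(300, 6))            # Zernike coefficients, normal wafers
maverick = rng.normal(6, 1, size=(5, 6))            # abnormal leveling signatures
X = StandardScaler().fit_transform(np.vstack([normal, maverick]))

labels = DBSCAN(eps=2.0, min_samples=5).fit_predict(X)
outlier_idx = np.where(labels == -1)[0]             # wafers flagged for measurement/review
print(len(outlier_idx), "flagged wafers")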
Ellipsometric critical dimension metrology employing mid-infrared wavelengths for high-aspect-ratio channel hole module etch processes
G. Andrew Antonelli, Nick Keller, Troy Ribaudo, et al.
A novel mid-infrared critical dimension (IRCD) metrology has been developed on a platform suitable for fab production. Compared to traditional optical critical dimension (OCD) technology based on ultraviolet, visible, and near-IR light, the IRCD system exploits unique optical properties of common semiconductor fab materials in the mid-infrared to enable accurate measurements of high-aspect-ratio etched features. In this paper, we show two examples of critical dry etch steps in the 3D NAND channel formation module of an advanced node that require nondestructive process control: (1) channel hole active area etch and (2) amorphous carbon hardmask etch. In the first example, we take advantage of the absorption bands of silicon dioxide and silicon nitride to get accurate CD measurements at different depths, resulting in high-fidelity z-profile metrology of the channel – key to guiding process development and accelerated learning for 3D NAND device manufacturing. In the second example, the most common amorphous carbon hardmask materials for advanced 3D NAND nodes are opaque in the traditional OCD wavelength range; however, in the mid-infrared, there is light penetration and hence spectral sensitivity to dimensional parameters, including sub-surface features. We show successful detection of intentional process skews as well as accurate bottom CD measurements of the hardmask.
Methods to overcome limited labeled data sets in machine learning-based optical critical dimension metrology
Franklin J. Wong, Yudong Hao, Wenmei Ming, et al.
With the aggressive scaling of semiconductor devices, the increasing complexity of device structures, coupled with a tighter metrology error budget, has driven Optical Critical Dimension (OCD) time to solution to a critical point. Machine Learning (ML), thanks to its extremely fast turnaround, has been successfully applied in OCD metrology as an alternative to conventional physical modeling. However, the expensive and limited reference data or labeled data sets necessary for ML to learn from often lead to under- or overlearning, limiting its wide adoption. In this paper, we explore techniques that utilize process information to supplement reference data and synergize physical modeling with ML to prevent under- or overlearning. These techniques have been demonstrated to help overcome the constraint of limited reference data with use cases in challenging OCD metrology for advanced semiconductor nodes.
Nanosheet and Nanowire
Spectroscopy: a new route towards critical-dimension metrology of the cavity etch of nanosheet transistors
J. Bogdanowicz, Y. Oniki, K. Kenis, et al.
Nanosheet Field-Effect Transistors (FETs) are candidates to replace today’s finFETs as they offer both an enhanced electrostatic control and a reduced footprint. The processing of these devices involves the selective lateral etching, also called cavity etch, of the SiGe layers of a vertical Si/SiGe superlattice, to isolate the future vertically stacked Si channels. In this work, we evaluate the capabilities of various conventional Critical Dimension (CD) and alternative spectroscopic techniques for this challenging measurement of a buried CD. We conclude that Raman and energy-dispersive X-ray spectroscopies are very promising techniques for fast inline cavity depth measurements.
Nanosheet metrology opportunities for technology readiness
Over the past several years, stacked Nanosheet Gate-All-Around (GAA) transistors have captured the focus of the semiconductor industry and have been identified as the new lead architecture to continue LOGIC CMOS scaling beyond the 5nm node. The fabrication of GAA devices requires new specific integration modules. From very early processing points, these structures require complex metrology to fully characterize the three-dimensional parameter set. As the technology continues through research and development cycles and looks to transition to manufacturing, many opportunities and challenges remain for inline metrology. Especially valuable are measurement techniques which are non-destructive, fast, and provide multi-dimensional feedback, where reducing dependencies on offline techniques has a direct impact on the frequency of cycles of learning. More than for previous nodes, then, this node may be when some of these offline techniques jump from the lab to the fab, as certain critical measurements need to be monitored in real time. Thanks to the compute revolution this very industry enabled, machine learning has begun to permeate inline disposition, and hybrid metrology systems continue to advance. Metrology solutions and methodologies developed for prior technologies will also still have a large role in the characterization of these structures, as effects such as line edge roughness (LER), pitchwalk, and defectivity continue to be managed. This paper reviews related prior studies and advocates for future metrology development that ensures nanosheet technology has the inline data necessary for success.
Scatterometry of nanowire/nanosheet FETs for advanced technology nodes
Here, we report the measurement of the dielectric spacer etch process for nanowire and nanosheet FET processes. A previously described Nanowire Test Structure (NWTS) was used for this study.[1, 2, 3] This structure has alternating Si/Si1-xGex/…/Si multilayers. Subsequent to the selective etching of the Si1-xGex layers (cavity etch), a silicon nitride (SiN) dielectric layer was deposited on the NWTS. Here we report on the use of Mueller Matrix Spectroscopic Ellipsometry based Scatterometry (MMSE) to measure the thickness of the SiN dielectric layer after deposition and after trim etch steps. Four different amounts of trim etch were characterized.
In-line Raman spectroscopy for stacked nanosheet device manufacturing
D. Schmidt, C. Durfee, J. Li, et al.
In-line Raman spectroscopy for compositional and strain metrology throughout front-end-of-line manufacturing of next generation stacked gate-all-around nanosheet field-effect transistors is presented. Thin and alternating layers of fully strained pseudomorphic Si(1-x)Gex and Si were grown epitaxially on a Si substrate and subsequently patterned. Intentional strain variations were introduced by changing the Ge content (x = 0.25, 0.35, 0.50). Polarization-dependent in-line Raman spectroscopy was employed to characterize and quantify the strain evolution of Si and Si(1-x)Gex nanosheets throughout front-end-of-line processing by focusing on the analysis of Si-Si and Si-Ge optical phonon modes. To evaluate the accuracy of the Raman metrology results, strain reference data were acquired by non-destructive high-resolution x-ray diffraction and from destructive lattice deformation maps using precession electron diffraction. It was found that the germanium-alloy composition as well as the Si and Si(1-x)Gex strain obtained by Raman spectroscopy are in excellent agreement with reference metrology and follow trends of previously published simulations.
OCD enhanced: implementation and validation of spectral interferometry for nanosheet inner spacer indentation
In this work, a novel enhancement to multichannel scatterometry data collection, Spectral Interferometry, is introduced and discussed. The Spectral Interferometry technology adds unique spectroscopic data by providing absolute phase information. This enhances metrology performance by improving sensitivity to weak target parameters and reducing parameter correlations. Spectral Interferometry enhanced OCD capabilities were demonstrated for one of the most critical and challenging applications of gate-all-around nanosheet device manufacturing: lateral etching of SiGe nanosheet layers to form inner spacer indentations. The inner spacer protects the channel from the source/drain regions during channel release and defines the gate length of the device. Additionally, a methodology is presented which enables reliable and reproducible manufacturing of reference samples with engineered sheet-specific indent variations at nominal etch processing. Such samples are ideal candidates for evaluating metrology solutions with minimal destructive reference metrology costs. Two strategies, single-parameter and sheet-specific indent monitoring, are discussed, and it was found that the addition of the spectroscopic information acquired by Spectral Interferometry improved both optical metrology solutions. In addition to improving the match to references for single-parameter indent monitoring, excellent sheet-specific indent results can be delivered.
Edge Placement Error
Edge placement error wafer mapping and investigation for improvement in advanced DRAM node
Kuan-Ming Chen, Wolfgang Henke, Ji-Hoon Jung, et al.
In this paper, budget characterization and wafer mapping of the Edge Placement Error (EPE) are studied to manage and improve pattern defects, with a use case selected from SK Hynix’s most advanced DRAM 1x nm product. To quantify EPE, CD and overlay were measured at multiple process steps and then combined for the EPE reconstruction. Massive metrology was used to capture extreme statistics and the fingerprint across the wafer. An EPE budget breakdown was performed to identify the main contributors and their variations. The end result shows that EPEmax is mostly driven by local CD and overlay components, while EPE variation is dominated by overlay and global CD components. Beyond the EPE budget, a novel EPE wafer mapping methodology is introduced to visualize the temporal and spatial EPE performance, which captures variation not seen from CD and overlay alone. This enables root-cause analysis of the pattern defects and provides a foundation towards a better process monitoring solution. For EPE improvement, serial CD and overlay optimization simulation was performed to verify opportunities for reduction of the EPE and its variation using the available ASML applications. The potential improvement for this use case was confirmed to be 4.5% compared to baseline performance.
Improvement of EPE measurement accuracy on ADI wafer, the method of using machine learning trained with CAD
Yosuke Okamoto, Shinichi Nakazawa, Akinori Kawamura, et al.
Precise metrology for edge placement error (EPE) is required especially in the EUV era. Last year, we proposed a new contour extraction algorithm using machine learning and verified its robustness to SEM noise on AEI patterns. In this study, we propose a method for contour extraction on ADI patterns and improve the EPE measurement accuracy. It is known that the gray-level signal profile across a pattern edge in a SEM image varies depending on the angle between the e-beam scan direction and the pattern edge; in particular, the contrast of pattern edges parallel to the scan direction is low and unstable. In addition, in the case of ADI, the gray levels of the SEM image vary and show shading because of charging caused by e-beam exposure of the pattern. Therefore, contour extraction on ADI patterns using only a simple feature value or a few thresholds is usually inaccurate. However, precise contour extraction independent of the e-beam scan direction is strongly required for 2D pattern inspection and metrology. In this paper, we propose a novel contour extraction method for precise EPE metrology on ADI regardless of the e-beam scanning direction relative to the pattern edge. We use machine learning to extract contours, split the training data according to the target edge direction, and train a contour extraction model. This model is expected to learn not only the gray-level variation but also the drift of the landing position caused by the charging effect on ADI. We captured SEM images on the ADI wafer with several scan directions and compared the contours extracted by the conventional method with those extracted by the proposed method; the improvement of EPE measurement accuracy for every pattern direction on ADI is thereby verified.
The importance of in-die sampling using E-beam solution on yield improvement
As technology progresses with scaling to meet market requirements, the patterning characterization of dense features presents a significant challenge for current optical tools, and measurement accuracy will be an important index and a great challenge as well. Patterning can mostly be characterized with overlay (OVL) and CDU (critical dimension uniformity) measurements. When the overlay error budget is broken down, one of the challenges is the gap between measurement results on the scribe line and on the device, which provides improper information for overlay correction and hampers process anomaly (excursion) detection, resulting in low yield at the end of the production process. An eBeam tool, using high electron landing energies while utilizing the Elluminator™ technology [1] for improved backscattered electron (BSE) imaging efficiency, can be utilized to directly capture the OVL performance of device units in-die, at both local and global levels, thanks to the BSE function of the eBeam tool [2]. In this paper, we demonstrate overlay measurement of M0 to poly line in the device for an advanced logic node (OVL X measurement only), obtaining the overlay gap between in-die and scribe-line locations to capture the actual behavior of device units in-die. Massive OVL data are measured using the eBeam tool at high speed and high resolution, and local OVL results have been analyzed in detail. We quantify the impact on overlay correction of the different measurement approaches, whether based on the optical tool or the eBeam tool, and the resulting benefit for yield improvement.
Contour-based metrology for assessment of edge placement error and its decomposition into global/local CD uniformity and LELE intralayer overlay
Wenzhan Zhou, Fang Wei, Yu Zhang, et al.
Edge placement error (EPE) analysis, which combines pattern variation data from single litho-process steps with overlay data from subsequent litho-process steps, has been well established as a key methodology to characterize the performance of complex semiconductor manufacturing processes. As critical dimensions shrink in new semiconductor technologies, process margins become tighter, and characterizing and monitoring EPE budgets becomes more important than ever to assess and maintain in-line process performance and yield. In this paper, we present SEM image contour-based EPE analysis and budget generation for a BEOL multi-patterning (LELE) layer. SEM contour analysis was previously shown to be a suitable method for pattern variability characterization, with the capability to capture not only pattern size, but also shape and local stochastic placement variations, and to provide statistical overlay margin estimates between separate device layers. In the current work, we also show that for a LELE process, contour analysis provides local overlay measurements and all inputs needed to generate the complete EPE budget breakdown. Multiple wafers from a device in production were provided after processing the second etch step of a metal layer LELE process. We acquire large field-of-view SEM images with a high-throughput e-beam tool (HMI eP5), sampled within die, across exposure field and across wafer in order to enable analysis of variability into global and local components. Pattern contours are extracted from individual SEM images, and contours are ‘stacked’ to identify specific locations of largest variability or smallest margin. While the images contain patterns from both processing steps, these can be uniquely distinguished after die-to-database alignment and labeled by mask ID, here 1st and 2nd litho-etch layers, respectively. In addition to size, shape and stochastic placement variations, we perform center-of-gravity analysis between patterns on the 1st and 2nd litho-etch layers. The latter reveals local on-device overlay variations that can be mapped across the measured wafers. The contour analysis therefore provides all information required for a thorough EPE budget breakdown, i.e. global CDU and local CDU for the most critical cutline locations, as well as overlay. Figure 1 shows the breakdown for one particular point of interest. We perform EPE budget analysis for multiple wafers, which can highlight wafer-to-wafer variations. This is a first step toward process monitoring, which would not only highlight process drifts, but also distinguish main contributors in order to aid in trouble shooting. KEYWORDS: pattern variability, pattern fidelity, contour analysis, edge placement error, holistic lithography, SEM metrology
EPE budget analysis and margin co-optimization on the multiple critical on-device features in a single image for yield enhancement
Yaniv Abramovitz, Boo-Hyun Ham, Sangho Jo, et al.
The advanced logic node is continuously shrinking toward 1nm, and EUV lithography is one of the main technical drivers to reach better patterning resolution combined with reduced process steps. Along with this design rule shrink, patterning control with the metric of Edge Placement Error (EPE), whose main contributors are CD and overlay error, becomes more and more critical. EPE-aware process margin studies [1] are of growing interest and focus for advanced nodes. However, these studies are mostly focused on a single type of device feature or hot spot (HS), and the EPE budget breakdown is analyzed using data sets from different process steps, metrology tools (SEM, OM), and measurement targets (device structure, scribe line target), which inherently raises problems of data integrity and proximity [2]. Due to the complicated and localized process loading of the various patterns on a logic device, EPE analysis is required on diverse critical device features (HSs), which show different EPE fingerprints due to process loading and multi-patterning effects among unit process characteristics. To accomplish this multiple-HS EPE-aware analysis, the measurements must be taken on various real device patterns. This requirement leads to data collection with an e-beam tool, using high electron landing energies while utilizing the Elluminator™ technology for improved backscattered electron (BSE) imaging efficiency. This is the unique and right approach to directly capture CD and overlay simultaneously in-die, on-device, on multiple HSs at local and global levels. In this paper, we demonstrate the EPE budget analysis on various on-device HSs, which show different process fingerprints due to local process loading effects. The all-in-one, on-device pattern measurement is the essential prerequisite capability for this study. The data are captured from high-quality e-beam images, which deliver CD-SEM-comparable precision and low-TMU overlay metrology on the real device. We demonstrate that, with the multi-HS EPE-aware analysis from the all-in-one on-device data, a balanced EPE margin is achieved through co-optimized correctables with weighting factors among HSs to increase yield at the end of line. Keywords: CD, Overlay, e-beam, EPE, accuracy, on device, HS, yield
Excursion prevention by edge placement error reduction using photomask tuning
Rolf Seltmann, Tino Hertzsch, Matthias Ruhm, et al.
In our paper we will extend the previously introduced “Excursion Prevention” concept for CDU improvement [1] toward the reduction of Edge Placement Errors by additionally correcting intra-field overlay errors using the ZEISS ForTune system for a critical 22nm contact-to-gate application. We will discuss the EPE budget and compare intra-field, inter-field and local variations. In particular, it will be shown how the interaction of remaining systematic intra- and inter-field errors can create local EPE hotspots that are prone to result in patterning failures. By convoluting these systematic EPE errors with random across-wafer and local EPE errors, we will predict patterning failure maps across the wafer, where the EPE hotspots manifest in local failures and thus yield stripes. Finally we will discuss how the reduction of intra-field overlay and CDU errors by using ForTune mask tuning is able to dramatically reduce the EPE hotspots and thus helps to prevent patterning failure and yield striping. [1] R. Seltmann, A. Narayana Samy, T. Thamm, O. Sharoni, Y. Sufrin, A. Cohen, T. Scheruebl, “Improving Chip Performance by Photomask Tuning: Ultimate intra-field CD control as a major part of an overall excursion prevention strategy,” Proc. SPIE 11148, Photomask Technology 2019.
Overlay Accuracy II
Fundamental understanding of the interplay between target and sensor brings diffraction based overlay to the next level of accuracy
Simon Mathijssen, Tim Davis, Arie den Boef, et al.
Driven by the increasing demand for better on-product overlay control, the metrology specifications continue to tighten as nodes advance. Metrology accuracy budgets are reaching levels close to 1 angstrom. To meet these stringent requirements on metrology accuracy, understanding the interplay between the metrology target and the metrology tool is of paramount importance to advance diffraction based overlay (DBO) to the next level of accuracy. In this contribution we will first present a model for the signal formation of DBO. We will show how overlay is translated into a measurable intensity asymmetry using biased gratings. When the metrology target is symmetric, the accuracy of the measurement is limited by photon shot noise only. We will show how an asymmetric target deformation leads to an overlay ambiguity that deteriorates the accuracy of the overlay metrology well beyond the fundamental photon shot noise limit. To recover from the effects of asymmetric target deformation, we will show that additional information is needed on top of the traditional single-wavelength overlay reading on one target. When the target deformation is small, that additional information can, for example, come from multiple wavelengths. For large asymmetric target deformations causing a significant center-of-gravity shift, a plurality of targets or a description of the stack is needed. These specific solution directions will be discussed in detail. In conclusion, in this paper we dive into the fundamentals of diffraction based overlay, exploring the physics of the signal formation in the presence of asymmetric target deformation. We will show that a better understanding of the signal formation helps to develop methods that advance diffraction based overlay metrology to the next level of accuracy, which is needed to fulfill the increasing demand for better on-product overlay control.
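For orientation, the standard two-bias DBO relation usually assumed in this kind of signal-formation discussion is recalled below (to fix notation only; the paper's treatment is more general). With bias gratings at +d and -d and small shifts, the measured intensity asymmetries are approximately linear in the total shift, A_± ≈ K(OV ± d), so the overlay follows from the two asymmetries without knowing the proportionality constant K:

\[
  \mathrm{OV} \;\approx\; d\,\frac{A_{+} + A_{-}}{A_{+} - A_{-}}
\]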
Self-referenced and self-calibrated MoiréOVL target design and applications
Cheuk Wun Wong, Neelima Rathi, Taher Kagalwala, et al.
This paper presents a new overlay metrology target design and scheme referred to as MoiréOVL™. It utilizes the Moiré patterns of two overlapping gratings to amplify the kernel response to overlay misalignment and thereby has the potential to enhance kernel sensitivity, detectability and measurement accuracy. A Self-referenced (SR) MoiréOVL design scheme, which enables MoiréOVL to be measured with existing image-based overlay tools, is proposed and evaluated on a contact layer. This paper demonstrates the feasibility of SR-MoiréOVL on existing IBO tools. When compared to the reference SEM-based overlay, a magnification factor of 6.9X with an R2 of 0.96 and a calibrated intercept of 0.34nm was observed on wafer. A comparison between MoiréOVL and POR IBO on TIS, residuals, precision and TMU is presented. Lastly, we present the idea of a Self-calibrated (SC) MoiréOVL scheme to calibrate the magnification factor on the fly during measurement for enhanced usability.
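For intuition on where the magnification factor comes from, the basic moiré relations for two overlapping gratings of pitches P1 and P2 are recalled below; these are generic textbook relations, not the MoiréOVL target design itself, and the fringe-shift expression assumes small relative shifts δ of one grating.

\[
  P_{\mathrm{moir\acute{e}}} \;=\; \frac{P_1 P_2}{\lvert P_1 - P_2\rvert},
  \qquad
  \Delta x_{\mathrm{fringe}} \;\approx\; \delta \cdot \frac{P_1}{\lvert P_1 - P_2\rvert}
\]

That is, the overlay error is read out with an amplification of order P/(P1 - P2), which is the mechanism behind magnification factors such as the 6.9X reported above.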
Statistical process optimization method for metrology equipment
We developed a statistical method that can be applied to overlay metrology tools to improve the performance and time-to-results (TTR) of multi-cycle optimization based on the brute-force method. First, we evaluated full response surfaces for each combination of the discrete equipment settings and calculated desirability scores using a normalization function. Second, we combined gradient optimization techniques and response surface methodologies to find the important local maxima (centers of the islands in the quadratic contour) and stationary response points. Once all the stationary response points have been identified, users can choose to rank the solutions by quality or use analysis of variance (ANOVA) methods to determine which main effects and/or interactions are of interest. Two separate layers were evaluated and compared to the process of record (POR) brute-force method of optimization. Results showed that the best residual values from recipes optimized using 1-cycle SPOC-based automatic recipe optimization (ARO) and ARO based on the 2-cycle brute-force strategy were comparable to known residual values from the POR recipes. Moreover, SPOC-based ARO was performed with a TTR of under 2 hours, while a 2-cycle brute-force ARO typically took 6 to 20 hours depending on the specific configuration. The vast reduction in optimization time is primarily attributed to the elimination of multi-cycle refinement, whose data collection dominated the previously observed TTR. In conclusion, we demonstrated the ability to reduce time to solution by a factor of 3 while maintaining or improving on overlay residuals compared to existing brute-force methodologies.
Signal weighted overlay optimizer for scatterometry metrology
A new algorithm called Single Wavelength Overlay Optimizer (SWOOP) enhances the performance of single-wavelength optical diffraction-based overlay metrology. SWOOP combines statistical learning with a physical model to advance the performance of single-wavelength measurements to that of multi-wavelength measurements. This is achieved by making a set of multi-wavelength measurements on the first wafer during a train phase and extracting the characteristic signature of the overlay inaccuracy at the pupil plane. This inaccuracy signature is then evaluated and removed in real time for single-wavelength measurements, resulting in improved accuracy and robustness to process variation without compromising throughput.
Moiré effect-based overlay target design for OPO improvements
Dieter Van den Heuvel, Philippe Leray, Eitan Hajaj, et al.
As design nodes of advanced semiconductor chips shrink, reduction in on-product overlay (OPO) budget becomes more critical to achieving higher yield. Imaging-based overlay (IBO) targets usually consist of periodic patterns where their pitches are resolvable with visible light microscopy. The difference between the feature dimensions of the device and the optical target is growing as device design nodes shrink. To make the optical target emulate the device as much as possible, the target’s feature periodicity is reduced. Using this approach, the process impact on the device is simulated on the overlay target which enables a more accurate measurement on grid (target) in terms of OPO matching. To further optimize IBO performance, a new moiré effect based robust Advanced Imaging Mode (rAIM™) target design was developed. This rAIM IBO target is implemented using significantly smaller pitches compared to the standard AIM® target, resulting in a more device-like target design. In this paper we investigate the benefits of the optical improvement, manifested as the target gain, and the process compatibility benefits to improve the target accuracy, robustness, and measurability to meet overlay (OVL) basic performance requirements, such as total measurement uncertainty (TMU).
Novel diffraction-based overlay metrology utilizing phase-based overlay for improved robustness
Masazumi Matsunobu, Toshiharu Nishiyama, Michio Inoue, et al.
The current state-of-the-art ADI overlay metrology relies on multi-wavelength uDBO techniques. Combining the wavelengths results in better robustness against process effects like process-induced grating asymmetries. Overlay information is extracted in the image plane by determining the intensity asymmetry in the 1st order diffraction signals of two grating pairs with an intentional shift (bias). In this paper we discuss a next evolution in DBO targets where a target is created with multiple biases. These so-called cDBO (continuous bias DBO) targets have a slightly different pitch between the top and bottom gratings, which has the effect of producing different bias values along the grating length, and they are complementary to the uDBO technology. Whereas for the uDBO target the diffraction results in a uniform intensity pattern that carries the overlay signal, for cDBO an oscillating intensity pattern occurs, and the overlay information is captured in the phase of that pattern. Phase-based overlay has an improved, intrinsic robustness over intensity-based overlay and can reduce the need for multi-wavelength techniques in several cases. Results on memory technology wafers confirm that the swing-curve (through-wavelength) behavior is indeed more stable for the phase-based DBO target and that, for accurate overlay, this target can be qualified with a single-wavelength recipe (compared to the uDBO dual-wavelength recipe). In this paper, both initial results on a Micron feasibility wafer and demonstrated capability in a production environment will be shown.
Metrology and Inspection for the EUV Era
The unavoidable renaissance of electron metrology in the age of high NA EUV
Things are drastically changing in the field of metrology. The main reason for this is the daunting specification requirements for metrology imposed by high Numerical Aperture Extreme Ultraviolet Lithography (high NA EUVL). We observe a variety of new-generation e-beam tools proliferating in imec's unique ecosystem, from in-line Transmission Electron Microscope (TEM) to Voltage Contrast (VC) overlay tools, from Die To Database (D2DB) large area Scanning Electron Microscope (SEM) to high-voltage SEM, from Artificial Intelligence (AI)-based inspection tools to massive data acquisition e-beam systems. We are facing a renaissance of e-beam metrology. In this paper, we are going to describe the challenges as well as the latest evolutionary developments of e-beam metrology in the semiconductor industry.
Defect characterization of 28 nm pitch EUV single patterning structures for iN5 node
Kaushik Sah, Sayantan Das, Andrew Cross, et al.
As we strive toward smaller and smaller pitches to enable device scaling, thorough defect characterization at wafer scale continues its importance during the early phases of process optimization. In this paper, we describe experiments and show characterization results for capturing stochastic defects across various test structures of 28 nm pitch devices that have been patterned using single exposure EUV lithography. The objective of this work is to quantify detection sensitivity of critical defect types on multiple test structures, and study wafer and die level signatures for some of the types. We will employ various, complementary optical and e-beam inspection and review techniques. Further, new methods to increase sensitivity of optical inspection after litho are also discussed.
Massive e-beam metrology and inspection for analysis of EUV stochastic defect
Seulki Kang, Kotaro Maruyama, Yuichiro Yamazaki, et al.
In the extreme ultraviolet (EUV) lithography process, stochastic defects are randomly generated and can have a significant impact on the yield of high-volume manufacturing (HVM) when printed even at an extremely low probability, down to the parts per trillion (ppt) level. In this field, electron beam inspection (EBI) tools are regarded as a promising option to detect killer defects with a sufficient capture rate. However, EBI requires a longer inspection time, which is pointed out as its current limitation. To overcome this limitation, throughput optimization and a data collection strategy must be prepared to push the bounds of EBI capability. In this paper, we study the probability of EUV stochastic defects and their statistical signature in massive data using EBI. Tool performance for detecting defects is maximized by investigating the impact of different scanning electron microscopy (SEM) parameters on throughput and defect capture rate. After performance verification, we demonstrate the massive metrology and inspection performance of Die to Database Edge Placement Error (D2DB EPE) to extend the prediction range of stochastic defect probability down to the order of 1 defect/mm2. The method is applied to EBI results on EUV-processed pitch 32nm line and space (L/S) patterns to prove the necessity of massive e-beam data analysis of low-level defectivity and intra-field variation.
Scatterometry solutions for 14nm half-pitch BEOL layers patterned by EUV single exposure
Sayantan Das, Joey Hung, Sandip Halder, et al.
To keep up with logic area scaling, BEOL dimensions have been reduced at an accelerated pace, leading to ever smaller metal pitches and reduced cross-sectional areas of the wires. As a result, routing congestion and a dramatic increase in RC delay (resulting from an increased resistance-capacitance product) have become important bottlenecks for further interconnect scaling, driving the need to introduce new materials and integration schemes in the BEOL. The current paper studies a damascene process flow that uses single-exposure EUV to create metal lines and 2D patterns at a metal half-pitch of 14 nm, corresponding to the imec N5 node for logic BEOL layers. A bright-field mask with a negative tone resist process was used to develop trenches and transfer these patterns into an oxide dielectric layer. Following this, the trenches were filled with ruthenium (Ru) for electrical testing. The test vehicle included multiple structures, including E-test resistance and capacitance structures, to allow a comprehensive study of the proposed process flow. Metrology requirements and performance at various process steps will be discussed in this paper. Our focus is on scatterometry methods that, together with machine learning (ML), allow fast and accurate measurements of multiple parameters of interest at large sampling. We present results for inline measurements of line and space critical dimensions (CD) and line edge roughness (LER), after patterning and after hard mask etch, and the prediction of the electrical performance of the metal lines after Ru CMP. In addition, scatterometry ML capabilities for inline tip-to-tip (T2T) measurements are successfully demonstrated.
Assessment of stochastic fail rate using E-beam massive metrology
For advanced DRAM nodes, process window requirements have become extremely tight for Critical Dimension (CD) and Overlay. Determining the process window based only on mean CD, without quantifying defectivity on the wafer, is no longer adequate. A superior approach illustrated in this paper is to generate pattern-specific stochastic failure rate (FR) models that capture the non-normal nature of the fail rate distribution, using large numbers of measurements on focus-exposure-matrix (FEM) wafers. Such a model can be used to predict FR contours down to the part-per-billion (ppb) level or below. The defectivity process window then corresponds to the FR contour that meets the process FR requirement. Patterns can be grouped based on their characteristics into a single model, but generally multiple models must be generated to cover the range of printed geometries. This modeling methodology was applied to the periphery support circuitry for a DRAM technology currently in development. The particular layer involved a bidirectional line/space pattern printed using DUV. Models were built and defectivity contour plots were generated for ten different patterns. Their contours at the target FR were compared to determine whether any posed a concern for high-volume manufacturing. Directly inspecting these patterns and measuring the FR down to the ppb level was impossible for two reasons: 1) the inspection time would be prohibitive, and 2) there are not enough replicates on the wafer to do so. To evaluate the accuracy of the final stochastic-aware process windows, the FR on select patterns was directly measured and showed good agreement. This information can be used as a basis for modifying the OPC model or even the periphery design for problematic patterns. Overall, this methodology provides a very efficient way to tune patterning to meet HVM requirements.
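The abstract does not disclose the actual model form, so the sketch below is a hedged stand-in: it fits log10(fail rate) to a simple quadratic response surface in focus and dose from FEM measurements, then marks the (focus, dose) region predicted to meet a ppb-level target, which is the sense in which a "defectivity process window" contour can be drawn.

```python
import numpy as np

def fit_log_fr_model(focus, dose, fail_rate):
    """Least-squares fit of log10(fail rate) to a quadratic surface in focus and
    dose; a simple illustrative stand-in for the pattern-specific FR models."""
    X = np.column_stack([np.ones_like(focus), focus, dose,
                         focus**2, dose**2, focus * dose])
    coef, *_ = np.linalg.lstsq(X, np.log10(fail_rate), rcond=None)
    return coef

def fr_process_window(coef, focus_grid, dose_grid, target_fr=1e-9):
    """Boolean mask of (focus, dose) points predicted to meet the target fail
    rate (e.g. 1 ppb); the True region is the defectivity process window."""
    F, D = np.meshgrid(focus_grid, dose_grid)
    X = np.column_stack([np.ones(F.size), F.ravel(), D.ravel(),
                         F.ravel()**2, D.ravel()**2, (F * D).ravel()])
    log_fr = X @ coef
    return (log_fr <= np.log10(target_fr)).reshape(F.shape)
```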
Multivariate analysis methodology for the study of massive multidimensional SEM data
Mohamed Saib, Gian Francesco Lorusso, Anne-Laure Charley, et al.
Over the years, the reduction in the size of semiconductor devices has made their performance extremely sensitive to small differences between printed structures and the intended design. As a consequence, metrology equipment manufacturers are nowadays proposing new tool configurations able to ensure quality control in such a challenging environment by generating massive multi-property measurement sets from inspected wafers. However, the unprecedented amount of acquired measurements and their intrinsic diversity create a new challenge in terms of data analysis. In this work, we propose an analysis method suitable for massive multi-descriptor data sets and apply it to the processing of measurements acquired on the GS1000, the latest-generation e-beam metrology tool from Hitachi. This new approach is based on the Parallel Coordinates Plot (PCP). The PCP representation is very efficient at condensing multidimensional data into a single plot, but it is not adapted to large data sets because of over-plotting problems. To overcome these issues, we have developed specific strategies that enable the PCP to be efficient for massive data analysis, by depicting the median of neighboring points together with the dispersion of each property. The experimental validation has been carried out on 1.7 billion Contact Hole (CH) measurements acquired on a test wafer. For each pattern, 28 different properties were quantified from the e-beam images and grouped into 3 categories: size/area, edge placement error, and gap. The analysis of the full data set with the proposed methodology clearly showed the FEM fingerprint and allowed us to determine the process window based on multi-criteria analysis. By combining the PCP with an Artificial Neural Network (ANN) we were able to accurately model the density of stochastic-cliff defects.
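As a simple illustration of the over-plotting workaround described above, the sketch below (ours, not the GS1000 analysis pipeline) collapses a massive per-pattern table to one median line per group before drawing a standard pandas Parallel Coordinates Plot; the column names ("cd", "epe", "gap", "field") are hypothetical.

```python
import matplotlib.pyplot as plt
import pandas as pd
from pandas.plotting import parallel_coordinates

def median_pcp(df, group_col="field", value_cols=("cd", "epe", "gap")):
    """Reduce a large per-pattern data set to one median line per group before
    plotting, which avoids the over-plotting problem of a raw PCP."""
    cols = list(value_cols)
    med = df.groupby(group_col)[cols].median().reset_index()
    # Normalize each descriptor so all PCP axes share a comparable scale.
    med[cols] = (med[cols] - med[cols].mean()) / med[cols].std()
    parallel_coordinates(med, group_col, cols=cols, alpha=0.7)
    plt.show()
```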
Late Breaking
Virtual metrology: how to build the bridge between the different data sources
In the world of today’s internet of things economy, the number of semiconductor designs is increasing rapidly. A cost effective way is needed to set up new designs for manufacturing. All available data sources need to be utilized to do the setup. In this paper we suggest two new approaches for reusing historical data for future designs: Combining historical fab-generated data with full reticle design features to predict optimal process conditions, and the concept of cross metrology integration of fab-generated data across multiple metrology steps to improve data quality.
Recent advancements in atomic force microscopy
M. van Reijzen, M. Boerema, A. Kalinin, et al.
Atomic Force Microscopy (AFM) topographic imaging has enabled semiconductor manufacturing research and development since the early 1990s. Its unique strengths over competing metrology techniques include the potential for undistorted, local, high-resolution information. Comparatively slow throughput has traditionally limited high-volume manufacturing (HVM) deployment. Here, we discuss the advantages of a multi-head AFM system with miniaturized high-speed SPMs working in parallel. In addition, we extend traditional AFM techniques to selective imaging and metrology of subsurface 3D structures and show a path to enabling overlay metrology through opaque hard mask layers.
Poster Session
Broadband scatterometry at extreme ultraviolet wavelengths for nanograting characterization
Moein Ghafoori, Lukas Bahrenberg, Sven Glabisch, et al.
The authors present a novel approach for the structural characterization of periodic nanostructures using spectrally resolved broadband scatterometry in the extreme ultraviolet (EUV) wavelength range. The implemented metrology method combines 0th and ±1st diffraction order spectrum measurements of a nanograting under broadband illumination from 8 nm to 17 nm for model-based reconstruction of geometrical parameters. For the experimental investigations, a compact stand-alone scatterometer setup is designed and realized. The setup enables measurements of the spectrally resolved 0th and ±1st diffraction orders of a grating illuminated at various grazing incidence angles. The acquired data serve as a basis for the reconstruction of the grating's geometry using a rigorous optical finite element method (FEM). The method is applied to arrays of lines and spaces with sub-100 nm feature size.
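Model-based reconstruction of this kind is, in essence, a nonlinear least-squares fit of simulated spectra to measured ones. The sketch below is a generic, hedged illustration of that loop (not the authors' solver): `forward_model` stands in for the rigorous FEM simulation and is not implemented here, and the parameter names are assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

def reconstruct_grating(measured_spectra, forward_model, p0, bounds):
    """Generic model-based reconstruction: find grating parameters (e.g. CD,
    height, sidewall angle) whose simulated diffraction-order spectra best match
    the measured ones. `forward_model(params)` is a placeholder for a rigorous
    electromagnetic solver and must return an array shaped like the measurement."""
    measured = np.asarray(measured_spectra, dtype=float)

    def residual(params):
        return (forward_model(params) - measured).ravel()

    fit = least_squares(residual, p0, bounds=bounds)
    return fit.x, fit.cost
```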
Detection and correlation of yield loss induced by color resist deposition deviation with a deep learning approach applied to optical acquisitions
Thomas Alcaire, Delphine Le Cunff, Jean-Hervé Tortai, et al.
On imager devices, color resists are used as optical filters to produce RGB pixel arrays. These layers are deposited by a spin coating process towards the end of the fabrication flow, where complex topography can induce a thickness inhomogeneity effect over the wafer surface, causing a radial striation signature that is predominant at the edge of the wafer. This deviation can induce significant yield loss but is hardly detectable with standard inline metrology or defectivity approaches. In this study, an interferometry-based metrology system and a reflectometry-based defectivity system were used to gather raw optical responses over the full wafer surface. Individual die cartographies were created from these data and a deep learning algorithm was trained on both optical techniques. We then applied the deep learning algorithm to a specific set of test wafers to determine the number of dies affected by striations. From there, we evaluated the correlation of the resulting classification with the final electrical tests performed on each die of those wafers.
Defect characterization of EUV Self-Aligned Litho-Etch Litho-Etch (SALELE) patterning scheme for advanced nodes
Kaushik Sah, Andrew Cross, Sayantan Das, et al.
In this paper, we investigate defect modes and pattern variations for an EUV (extreme ultraviolet lithography) double patterning scheme using a self-aligned litho-etch-litho-etch process on final pitch 28 nm test structures. As we continuously shrink device sizes towards aggressive pitches, the industry is moving towards adopting double patterning using EUV. Although we continue to push the limits of 0.33 NA EUV in terms of pitch and CD (critical dimension) with novel resists and processes, stochastic defects pose a greater challenge at pitches below 40 nm [1]. One way to circumvent this problem is to use a multi-patterning scheme with relaxed design rules. Self-aligned litho-etch-litho-etch (SALELE) is one such scheme for early BEOL (back end of line) layers. The benefit of this patterning approach is that no dummy metal is added and blocks are needed only at tight tip-to-tip definitions, which can help reduce parasitic capacitance. In this paper we employ SEM inspection techniques to understand pattern variability, after an initial optical inspection was done to discover the different defect modes. We show that analysis of SEM images can give further insight into process variations.
Effects of lithography process conditions on unbiased line roughness by PSD analysis
The reduction of line width and line edge roughness (LWR & LER) becomes increasingly challenging as the integrated circuit manufacturing industry develops, especially with the application of multi-patterning technology. In recent years, the unbiased roughness method has been widely adopted for LWR & LER characterization using power spectral density (PSD) analysis. Scanning electron microscope (SEM) measurement noise can be identified in the high-frequency region of the PSD curve; by subtracting the electron-beam noise contribution, the unbiased LWR & LER are obtained. In our research, unbiased LWR & LER under different lithography process conditions, including the reflectivity of bottom anti-reflective coating (BARC) materials, photoresists (PR), illuminations, and post-apply bake (PAB) and post-exposure bake (PEB) temperatures, were investigated by PSD analysis. For some of the above conditions, post-develop and post-etch LWR were also studied.
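The noise-subtraction idea can be illustrated with a minimal sketch (ours, not the study's exact procedure): estimate the flat white-noise floor from the high-frequency part of the spectrum and remove its total contribution from the biased variance. The fraction of the spectrum treated as noise-dominated is an illustrative choice, not a value from the paper.

```python
import numpy as np

def unbiased_lwr(line_widths, noise_frac=0.25):
    """PSD-style unbiased LWR (3-sigma convention): estimate the flat SEM-noise
    floor from the highest-frequency bins and subtract its contribution from
    the biased variance. `noise_frac` is an illustrative assumption."""
    w = np.asarray(line_widths, dtype=float)
    w = w - w.mean()
    n = len(w)
    spec = np.abs(np.fft.fft(w)) ** 2 / n**2   # per-bin power; spec.sum() == var(w)
    half = spec[1:n // 2]                       # positive-frequency bins only
    floor = half[int((1.0 - noise_frac) * len(half)):].mean()
    unbiased_var = max(spec.sum() - floor * n, 0.0)   # white noise fills all n bins
    return 3.0 * np.sqrt(unbiased_var)
```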
Mark design challenge of cut layer in FinFet
The requirement on overlay performance, which is determined by the alignment process during exposure and by the overlay measurement process, is getting tighter as the technology node shrinks in integrated circuits. Mark design has drawn a lot of attention, since appropriately designed marks can guarantee process compatibility and sufficient ability to track device behavior. Cut layers are widely used in FinFET technology to define the active area formed by SADP (self-aligned double patterning) or SAQP (self-aligned quadruple patterning). Mark design is especially challenging for the diffusion-break layer, since it is a cut layer that lands on the three-dimensional fin structure and is subsequently aligned to. In this paper, the mark design of the diffusion-break layer is investigated, including alignment marks and overlay marks with various substrates and segmentations. It is recommended that the whole process, from mark definition by lithography to final formation of the mark after etch, be taken into consideration during mark design, along with the substrate and segmentation, in order to avoid defects and achieve a qualified signal.
Data fusion by artificial neural network for hybrid metrology development
Hybrid metrology is a promising approach for accessing the critical dimensions of line gratings with high precision. The objective of this work is to use artificial intelligence (AI), mainly artificial neural networks (ANN), to improve nanoscale metrology by hybridizing several techniques, namely optical critical dimension (OCD) scatterometry, CD scanning electron microscopy (CD-SEM), CD atomic force microscopy (CD-AFM) and CD small-angle x-ray scattering (CD-SAXS). Trained on tabular virtual data generated by modelling, the ANN is able to predict the geometrical parameters with high accuracy relative to the true measured values and to detect irregularities in the input data.
Imaging-based overlay metrology optimized by HV-SEM in 3D-NAND process
Optical overlay metrology has been adopted for years as the baseline for overlay control in semiconductor manufacturing. A more stringent overlay budget for securing good product yield is required as device dimensions shrink. For effective and tight overlay control, traditional optical overlay metrology faces two primary challenges: increasing measurement accuracy and minimizing the measurement variance between overlay marks in the scribe lane and in-die device patterns. Overlay mark asymmetry is one of the general factors inducing optical overlay metrology error, and 3D-NAND deep-etch processes can induce within-wafer mark asymmetry that worsens the measurement robustness of optical overlay metrology. Accurately determining on-product overlay (OPO) errors at both after-develop inspection (ADI) and after-etch inspection (AEI) is also desirable in the 3D-NAND process for applying a non-zero offset (NZO) at photo exposure. To address the measurement robustness of optical overlay metrology in the 3D-NAND process, and to accurately bridge the scribe-lane-based optical overlay metrology to OPO metrology, a complementary overlay metrology by high-voltage scanning electron microscope (HV-SEM) was adopted as the reference metrology for optimizing the optical measurement condition on scribe-lane targets. In this paper, the measurement accuracy of the imaging-based overlay (IBO) target under various optical conditions was calibrated by HV-SEM. HV-SEM can measure both the scribe-lane and in-device targets by capturing buried structures, and it was employed to bridge the measurement results from the IBO and in-device targets. The optimal optical metrology condition can then be decided for both ADI and AEI to facilitate effective advanced process control (APC) and the NZO purpose.
Study of high throughput EUV mask pattern defect inspection technologies using multibeam electron optics
Hidekazu Takekoshi, Riki Ogawa, John G. Hartley, et al.
Using a proof-of-concept (POC) tool, we are continuing a feasibility study for an EUV mask pattern inspection tool with multibeam electron optics and D2D and D2DB functions. In this paper, we discuss our technologies in addition to the latest results obtained on our POC tool.
Accuracy enhancement in imaging-based overlay metrology by optimizing measurement conditions per layer
Shlomit Katz, Yoav Grauer, Raviv Yohanan, et al.
On-product overlay (OPO) challenges continue to be yield limiters for the most advanced technology nodes, requiring new and innovative metrology solutions. In this paper we cover an approach to boost accuracy and robustness to process variation in imaging-based overlay (IBO) metrology by leveraging optimized measurement conditions per alignment layer. The results apply to both DUV and EUV lithography for advanced Logic, DRAM, 3D NAND and emerging memory devices. This approach fuses multi-signal information including Color Per Layer (CPL) and focus per layer. Together with supporting algorithms, it strives to identify and address sources of measurement inaccuracy to enable tight OPO, improve accuracy stability and reduce overlay (OVL) residual error within the wafer and across lots. We present a theoretical overview, supporting simulations and measured data for multiple technology segments. Lastly, we discuss next steps and future development.
Process variation impacts on optical overlay accuracy signature
Rawi Dirawi, Shlomit Katz, Roie Volkovich
On Product Overlay (OPO) control is a critical factor in advanced semiconductor manufacturing. As feature sizes become smaller, OPO budgets become tighter, leaving less room for overlay (OVL) measurement inaccuracy. Over the last few years, overlay metrology’s focus has shifted inwards, towards accurate measurement conditions, as we aim to capture ever-smaller process and scanner variations. One method used to break down the OPO error budget is combining one or more accuracy flags and correlating them to various process impacts. Analyzing the overlay accuracy signature generated by accuracy flags can be useful for data validation, inspection and correlation to different processes and metrologies. In this paper, an extensive OVL accuracy experiment demonstrates the use of this new method. First, the method is applied to several wafers designed with intentional process variation, including variations in etch duration, Chemical Mechanical Polishing (CMP) duration, amorphous silicon (a-Si) thickness and titanium nitride (TiN) thickness. OVL results from the experimental wafers are compared with results from the reference (nominal) wafer.
Non-destructive depth measurement using SEM signal intensity
Jong-Hyun Seo, Changhwan Lee, Byoungho Lee, et al.
In this paper, we introduce a novel approach for measuring pattern depth using a non-destructive CD-SEM platform. We derive a dimensionless metric called the "depth index," designed to be proportional to the pattern depth. The depth index is calculated from the SEM signal intensity and the pattern geometry accessible by a normal CD-SEM. As a proof of concept, the depth index is obtained on etched hole patterns of different depths fabricated on a 300 mm wafer, and the correlation with depth is confirmed with a reasonable measurement repeatability of 1%. The depth index has been applied to process variation monitoring in NAND Flash memory, and a local depth variation of the holes of 4% is confirmed. An intra-wafer variation of 7-10% and wafer-to-wafer variation have also been detected.
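The abstract does not publish the exact definition of the depth index, so the sketch below is purely hypothetical: it combines a bottom-to-top intensity ratio with a geometry term, which is one plausible way a dimensionless quantity could be made to grow with hole depth. Function name, formula and inputs are all our assumptions, not the authors'.

```python
import numpy as np

def depth_index(i_bottom, i_top, hole_diameter_nm):
    """Hypothetical depth index (illustrative only, not the paper's definition):
    deeper holes return fewer secondary electrons from the bottom, so a -log of
    the bottom/top intensity ratio, scaled by the measured hole diameter, yields
    a dimensionless-per-nm quantity that increases with depth."""
    ratio = np.asarray(i_bottom, dtype=float) / np.asarray(i_top, dtype=float)
    return -np.log(ratio) * np.asarray(hole_diameter_nm, dtype=float)
```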
Absolute coordinate system adjustment and calibration by using standalone alignment metrology system
Satoshi Ando, Haruki Saito, Sayuri Tanaka, et al.
We propose a construction and calibration method for an absolute coordinate system for lithography tools. In the conventional overlay control system, subsequent layers are overlaid on the 1st layer, which may have unknown wafer distortion. This is inefficient, because the distortion of the 1st layer may be large. A higher level of overlay control can be realized with the absolute coordinate system by correcting the 1st layer to zero distortion. This is made possible by using an absolute alignment metrology system. To confirm our theory, we report experimental results on the absolute grid construction and the matching accuracy among multiple lithography tools.
Study of further image performance improvement by spectral bandwidth for ArFi lithography lightsource
Takamitsu Komaki, Shinichi Matsumoto, Koichi Fujii, et al.
DUV light sources used in the lithography process enable feature-size shrinkage and act as a driver for Moore's Law, taking semiconductor devices to a new level. Multi-patterning processes have been applied to break through the ArFi resolution limit; these processes require high pattern fidelity and placement accuracy. We have studied the impact of bandwidth distribution and asymmetry by imaging simulation. The results show that the bandwidth shape and asymmetry cause a change in CD. Our study shows that Gigaphoton's latest light source can contribute to improvements in fidelity and EPE.
An accurate and robust after-develop overlay measurement solution using YieldStar multi-wavelength optical metrology accompanied by a precise application strategy
The technology evolution of 3D-NAND storage devices requires an extensive research and development (R&D) phase with frequent process changes during the film deposition, lithography and etching steps. These process changes might impact the accuracy of overlay measurements, thereby influencing the on-product overlay performance. Moreover, as 3D-NAND storage density increases, the available budget for overlay metrology error is significantly reduced. The combination of frequent process changes in the R&D phase and a tighter overlay metrology budget increases the need for an accurate yet robust overlay metrology solution that can be adopted for the 3D-NAND development phase. In addition, such a metrology solution needs to be smoothly transferable to the ramp and eventually to the high-volume manufacturing (HVM) stage, where metrology throughput plays a significant role in reducing cost of ownership. In this paper, we report the YieldStar multi-wavelength diffraction-based overlay (μDBO) metrology as a solution to these challenges. Unlike conventional optical overlay metrology methods that use a single wavelength, this diffraction-based technique uses multiple wavelengths to measure every overlay metrology target, which proves to be robust against process-variation-induced metrology errors. To demonstrate the advantages of this new metrology solution, the accuracy, robustness, and throughput performance of the multi-wavelength μDBO technique are evaluated in the YMTC 3D-NAND manufacturing process. In addition, a well-defined application strategy is developed for reducing the number of measurement wavelengths based on process maturity, which results in a robustness gain without impacting the HVM throughput requirements.
Introducing machine learning-based application for writer main pole CD metrology by dual beam FIB/SEM
Dual beam focused ion beam/scanning electron microscopy (FIB/SEM) is a critical characterization technique used as inline metrology from the early stages of process development until high-volume manufacturing (HVM) of magnetic read/write heads in hard disk drives (HDD), owing to their complex three-dimensional geometry [1]. Despite its destructive nature, FIB/SEM metrology is critical to support high-throughput manufacturing processes for advanced process control during HVM in the HDD industry. Final cross-sectional SEM images typically include several CD measurements, and embedded or standalone standard machine vision applications are used as part of the metrology process. However, these applications are typically not able to accommodate the various process changes during rapid process development, and manual engineer assistance is often needed for accurate cut placement and SEM search. On the other hand, optimization of a machine vision application typically requires a reasonable number of images to allow training and optimization of the edge finder and pattern recognition functions. Reducing the training and optimization time needed for machine vision applications reduces the learning time during new process development. In this work, we introduce a machine learning based metrology application that minimizes the need for engineer involvement in recipe optimization during rapid process development [2]. By adding process-margin entities to the machine learning model, recipe robustness is significantly improved at the time of transition to new product introduction (NPI) and high-volume manufacturing (HVM). We compare the new machine learning based metrology application against the legacy machine vision application and study its impact on recipe writing time, wafer-to-wafer variations, and total measurement uncertainty (TMU). The new application enables recipes capable of cross-design metrology.
A novel method of overlay variation study for 3D NAND channel hole
In recent years, the pursuit of high storage capacity in 3D-NAND flash devices has driven the addition of more layers to increase the stack height. Challenges arise when etching high aspect ratio memory holes. Due to the existence of a thick and opaque hard mask layer, overlay control faces significant lot-to-lot variation and difficulty with run-to-run feedback control. In this paper, a fundamental study of channel hole overlay variation is presented by collecting and analyzing step-by-step overlay, etch tilt and stress data. The strong correlation between overlay, tilt and stress identifies the main contributor to overlay lot-to-lot variation as etch tilt, which in turn strongly correlates with etch chamber RF hours (the accumulated hours the chamber has run since its last PM event), without chamber dependency. In addition, overlay simulations showed that grouping lots by RF hours can effectively reduce lot-to-lot overlay variation.
AFM line space trench and depth measurement of fan-out fine-pitch high aspect ratio redistribution layer structure
As the redistribution layer (RDL) line and space in advanced fan-out assembly packages narrow to 2 um / 2 um with deeper structures, accurate critical dimension (CD) and depth measurements are necessary to monitor process quality. Traditional optical instruments are limited to a resolution of about 0.25 um, which is too coarse to measure narrow lines and spaces with high accuracy. A non-contact, wafer-format atomic force microscope (AFM) with a high aspect ratio tip is applied for this purpose. The method has been used to measure isolated and dense RDL patterns, including copper-etched and photoresist structures. The discrepancy between AFM and focused ion beam (FIB) measurements is less than 0.1 um for the top and bottom CD of the redistribution layer.
Accuracy aware pixel selection in multi-wavelength uDBO metrology enables higher robustness and accuracy for DRAM
Advancing technology nodes in DRAM continue to drive the reduction of the on-product overlay (OV) budget. This gives rise to the need for OV metrology with greater accuracy. However, ever-increasing process complexity brings additional challenges related to metrology target deformation, which can contribute to metrology error. Typically, an accurate OV measurement involves several engineering cycles for target and recipe optimization. In particular, process optimization in either the technology development (TD) phase or the high-volume manufacturing (HVM) phase might influence metrology performance and require re-optimization. Therefore, a comprehensive solution providing accuracy and process robustness, thereby minimizing cycle time, is highly desirable. In this work, we report multi-wavelength µDBO enhanced with accuracy-aware pixel selection as a solution for OV measurement that is robust against process changes and offers improved accuracy in HVM. Accuracy-aware pixel selection is capable of tackling intra-target processing variations and is built on a multi-wavelength algorithm with immunity to the impact of target asymmetry. DRAM use cases in FEOL critical layers are discussed in this paper. Superior robustness and accuracy are demonstrated together with improved on-product OV performance, promising a process-of-record metrology solution in specific applications throughout TD and HVM.
Method to improve the overlay image contrast and optimize the sub-segmentation mark
Overlay mark contrast plays such a fundamental role in overlay metrology that it dominates accuracy control and draws much attention during mark evaluations. However, as lithography nodes advance and 3D device stacks accumulate, overlay mark sub-segmentation and opaque hard masks are widely exploited, both of which can degrade mark contrast in certain cases. As a result, mark contrast improvement techniques have become a necessity and attract growing research interest. In this paper, we introduce a method to improve mark contrast using a pixel improvement technique and theoretical-reference correlation analysis. Such a method can be very effective when measuring overlay marks with extremely low contrast through thick opaque layers.
Excursion detection and root-cause analysis using virtual overlay metrology
Leon van Dijk, Kedir M. Adal, Mathias Chastan, et al.
Overlay is one of the most critical parameters in Integrated Circuit (IC) fabrication, as it is a measure of how accurately patterned features are positioned with respect to previously patterned features. Without good overlay, electrical contacts between features will be poor and there can be shorts or opens. Minimizing overlay errors during IC manufacturing is therefore crucial for ensuring high yield and that the performance and reliability specifications of the eventual device are met. For that reason, metrology plays a crucial role in IC fabrication for monitoring the overlay performance and process control. However, due to its high capital equipment cost and impact on cycle time, it is practically impossible to measure every single wafer and/or lot. This means that some excursions cannot be captured and that process drifts might not be detectable in an early phase. Virtual metrology (VM) addresses these challenges, as it aims at utilizing the significant amounts of data generated during manufacturing by the lithography clusters and other processing equipment for constructing mathematical and statistical models that predict wafer properties like overlay. In this way, overlay excursions and process drifts can be detected without actually measuring the overlay of these wafers. Preferably, VM is also able to link these excursions and drifts to particular root causes, enabling operators to take preventive measures in a timely manner. In this work, we develop virtual overlay metrology for a series of implant layers using a combination of physical and machine learning models. The implant layers relate to ion implantation steps following the Shallow-Trench-Isolation (STI) creation, and both the implant and STI layers are exposed using multiple lithography scanners. A physical model is used to address overlay contributors that can be derived directly from available data. Machine learning algorithms, which are able to learn models from data that can provide predictions for similar, unseen data, are used to predict contributions from less obvious sources of overlay errors. The capability of the overlay prediction model is evaluated on production data. A prediction performance of ~0.7 is achieved in terms of the R-squared statistic, and the VM is able to follow variations in the implant-layer overlay and to detect excursions. The excursions can originate from correctable as well as from non-correctable overlay errors. We will show that the interpretability of the prediction model allows us to identify the root cause of the high correctable error variation in the implant-layer overlay. Furthermore, overlay contributors will be identified that may not have a direct impact on the less critical overlay of implant layers but may contribute significantly to the Gate-to-STI overlay, and we will show the potential of virtual overlay metrology for downstream layer excursion detection.
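The combination of a physical model with a machine-learning residual model can be illustrated with the hedged sketch below (ours, not the authors' implementation): the physical prediction is taken as given, a regressor is trained on equipment-context features to predict the remaining error, and the combined prediction is scored with the R-squared statistic. Feature content and the choice of regressor are assumptions.

```python
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

def fit_virtual_overlay(context_features, physical_prediction, measured_overlay):
    """Hybrid virtual-metrology sketch: a physical model supplies a first-order
    overlay prediction; an ML regressor learns the residual from context data."""
    residual = measured_overlay - physical_prediction
    (X_tr, X_te, r_tr, r_te,
     p_tr, p_te, y_tr, y_te) = train_test_split(
        context_features, residual, physical_prediction, measured_overlay,
        test_size=0.3, random_state=0)
    model = GradientBoostingRegressor().fit(X_tr, r_tr)
    y_pred = p_te + model.predict(X_te)          # physical + learned residual
    return model, r2_score(y_te, y_pred)         # e.g. ~0.7 in the paper's case
```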
Robustness improvement in imaging-based overlay metrology for high topography layers by Talbot targets
As 3D NAND devices increase memory density by adding layers, scaling and increasing bits-per-cell, new overlay (OVL) metrology challenges arise. On-product overlay (OPO) performance may degrade for critical thick layers such as deck-to-deck alignment, while high aspect ratio (Z-axis) structures introduce stress, tilt and deformation that require accurate and robust OVL measurements. Advanced imaging metrology (AIM®) targets, which consist of two side-by-side periodic gratings in the previous and current layers, are typically used to measure OVL with imaging-based overlay (IBO) metrology systems. In this paper, we present a new approach that utilizes the Talbot effect in AIM to produce multiple contrast planes along the Z-axis, which enables a common focus position for both layers at a similar focus plane, resulting in improved measurement robustness. We present the Talbot effect theory, target design steps using the metrology target design (MTD) simulator, actual measurement results on an advanced 3D NAND device, and conclusions for such targets.
Lithography PR profile improvement and defects reduction by film pre-treatment
Photoresist (PR) profile and defectivity are very important in semiconductor manufacturing; an abnormal PR profile or defect can disturb subsequent processes and even affect device performance. For example, an undercut PR profile is prone to pattern collapse, while footing causes bridging at the pattern bottom; both lead to hard device failures. The PR profile is affected by many factors, such as substrate reflectivity, PR properties and focus, but one important factor is often ignored: the properties of the film in contact with the PR. If the underlying film does not match the PR, the PR profile is seriously affected and defect issues can arise, so investigating a lithography-friendly interface is a meaningful topic. In this paper, we focus on the effect of film treatment on PR profile and defectivity. Through experiments on O2 treatment, heat treatment and wet treatment, the lithography-friendly methods are identified and a possible mechanism is proposed. Film treatment provides a new way to improve the defectivity and PR profile, and thereby enlarge the lithography process window.
Automated extraction of critical dimension from SEM images with Weave™
Leandro Medina, Bryan Sundahl, Roger T. Bonnecaze, et al.
Semiconductor process engineers currently spend almost 10% of their time extracting critical dimensions from microscope images. Images are analyzed one by one, which is tedious, prone to human bias, time-consuming and expensive. Accurate, automated detection of edges and of the different materials in a stack are the key technical challenges for computer-extracted critical dimensions (CDs). Here we demonstrate the performance of a method for edge detection and material detection via segmentation methods embodied in the software tool Weave™. This approach uses optimized thresholding via a level set method to identify multiple edges and materials without the need for extensive, annotated experimental training data. The method is evaluated based on accuracy (prediction of CDs) and materials identification (ability to identify the different materials in an image). Based on an evaluation with 20 test SEM images, the method's performance is excellent: ninety percent of the CDs measured from the automated analysis are within 2% of the actual values, and the errors for the remaining 10% of measurements range from 4% to 9%.
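To show what thresholding-based CD extraction looks like in its simplest form, the sketch below uses a plain multi-Otsu global threshold as a stand-in for the optimized level-set thresholding in Weave™ (it is not the Weave™ algorithm); the row index, pixel size and the assumption that the brightest class is the feature of interest are all illustrative.

```python
import numpy as np
from skimage import io, filters

def measure_cd(sem_image_path, row, pixel_size_nm, classes=3):
    """Toy CD extraction: segment an SEM image into `classes` gray-level regions
    with multi-Otsu thresholding, then measure the extent of the brightest class
    along one image row and convert it to nanometers."""
    img = io.imread(sem_image_path, as_gray=True)
    thresholds = filters.threshold_multiotsu(img, classes=classes)
    labels = np.digitize(img, bins=thresholds)        # 0 .. classes-1 per pixel
    feature = labels[row] == labels.max()             # brightest class in this row
    cols = np.flatnonzero(feature)
    return (cols[-1] - cols[0] + 1) * pixel_size_nm if cols.size else 0.0
```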
In-line applications of atomic force microscope based topography inspection for emerging roll-to-roll nanomanufacturing processes
Liam G. Connolly, Michael Cullinan
This paper seeks to demonstrate the efficacy of a new approach for fast, in-line, and direct topography measurement of nano-scale structures and features on a flexible substrate, or web, in a roll-to-roll fashion. Nanofeatured products manufactured with R2R processes can be extremely cost competitive compared to more traditional, wafer-based solutions, in addition to having unique and desirable mechanical properties. As such, they are an area of immense research interest. However, despite the promise of these products for a plethora of applications, the leap from lab-scale prototypes to pilot- or volume-scale manufacturing has proven extraordinarily difficult and expensive, with both the required research and development investment and the achievable process yield proving sizable barriers. A key capability gap in the current art, and a roadblock on the path towards more widespread research and adoption of these R2R-fabricated products, is the lack of high-throughput, nanometer-scale metrology for process development, control, and yield enhancement. In this work, a new type of extremely compact, tip-based microscope, designed and fabricated with a micro-electro-mechanical systems approach, is applied to the challenge of direct topography measurement for roll-to-roll fabricated nanopatterns. A proof-of-concept tool, with subsystems to regulate the flexible web, isolate and position the atomic force microscope, and measure features on the substrate, all coordinated by a real-time embedded control system, is shown, and step-and-scan measurement results were acquired. However, to genuinely meet this extant need for roll-to-roll metrology, a system capable of atomic force microscope scanning despite a continuous, non-zero web velocity must be developed to meet throughput requirements without degrading measurement quality and thus help enable the next generation of R2R nanomanufacturing technology.
Memory OPO improvement using novel target design
Eitan Hajaj, Honggoo Lee, Chanha Park, et al.
Reduction in on product overlay (OPO) is a key component for high-end, high yield integrated circuit manufacturing. Due to the continually shrinking dimensions of the IC device elements it has become near-impossible to measure overlay on the device itself, driving the need to perform overlay measurements on dedicated overlay targets. In order to enable accurate measurement on grid (target) in terms of OPO matching, the overlay mark must be as similar as possible to the device in order to mimic the process impact on the device. Imaging-based optical overlay (IBO) provides the best accuracy and robustness for overlay metrology measurements for many process layers. To further optimize IBO performance, a new robust AIM (rAIM™) IBO target design was developed, using the Moiré effect. rAIM is implemented using significantly smaller pitches compared with the standard AIM® target, hence providing a more device-like target design. This new target design has the potential to improve target accuracy and robustness, to improve measurability, and to meet overlay basic performance requirements, such as total measurement uncertainty (TMU).
Advanced overlay target optimization and Integration in optical metrology
Eitan Hajaj, Diana Shaphirov, Eltsafon Ashwal, et al.
In recent years, simulation-based analysis has become an integral phase of the metrology target design process, optimizing performance to support on-product overlay (OPO) reduction, accuracy, and robustness to process variation. Moreover, a simulated unit (stack), represented by its optical and geometrical properties, can be used as a mathematical-physical object for obtaining a deeper understanding of the issues faced while an actual measurement is performed. Location-based stack calibration allows a noticeable wafer signature to be attained for both symmetrical and asymmetrical process variation. Using this information, one can analyze the target design's process compatibility and asymmetry stability. Furthermore, simulated data can be combined with measured data to establish a more exhaustive understanding of the process characteristics and risks, hence maximizing the measurement performance and the stability of the process and target behavior. Likewise, simulation tools can equip integration teams with a more holistic understanding and quantified data, prior to or alongside real-time measurements. In this paper we cover the simulation theory, use cases and results.
Investigation and optimization of STI dry-etch induced overlay through patterned wafer geometry tool
Tsu-Wen Huang, Ying-Cheng Chuang, Hsuan-Jui Huang, et al.
When a novel dry-etch tool was introduced for the shallow trench isolation (STI) process, it resulted in poorer overlay performance downstream. The patterned wafer geometry (PWG™) tool was utilized to investigate the observed difference in results between the new tool and the POR tool. By using metrics representing process-induced local shape and/or stress, a post-STI oxide fill rapid thermal process (RTP) was identified as the process step where the difference between the dry-etch tools was magnified. An experiment based on RTP temperature and ramp rate was conducted and the effect was evaluated using both GEN4–a predicted shape overlay metric derived from PWG shape measurement–and overlay measurements. Both GEN4 and overlay results indicated that high RTP temperature and low ramp rate could compensate for the process effects introduced by the new etch tool. The strong correlation between GEN4 and overlay also suggested that GEN4 may be a suitable upstream predictor of process-induced overlay excursion in such a case.
Plasma assisted particle contamination control: plasma charging dependence on particle morphology
With the introduction of EUV lithography, the control of contamination in advanced semiconductor processes has become increasingly critical. Our work is a joint effort (TU/e and VDL-ETG) and is aimed at the development of plasma-assisted contamination control strategies mainly focusing on airborne particles in a low pressure gas. We present experiments comparing the charge-to-mass ratio of single spherical micron-sized particles with that of non-spherical agglomerates thereof in the spatial plasma afterglow. It is shown that the charge-to-mass ratio of two-particle clusters deviates only 6% from that of singlets. This means that for the proposed mitigation strategy, of which the efficiency is based on the charge-to-mass ratio, it is acceptable to study the charging of spherical particles and to extrapolate the results towards non-spherical particles within a reasonable range.
A study on diffraction-based overlay measurement based on FDTD method
Buqing Xu, Qiang Wu, Rui Chen, et al.
As the semiconductor manufacturing critical dimension continues to shrink, the requirements placed on overlay control become much more stringent. Because absolute overlay tolerances are approaching 2-3 nm, process-induced errors can be a major contributor to the overlay error. For instance, chemical mechanical polishing (CMP) is found to cause 1-3 nm of overlay measurement error, which is of the same magnitude as the total overlay budget. Because of this, efforts are being made to investigate the mechanism of overlay shift caused by process variations. In this paper, we present a study of diffraction-based overlay (DBO) metrology using a model based on the finite-difference time-domain (FDTD) method to assess the impact of the CMP process on overlay measurement. Measurement errors caused by CMP are discussed. Our investigation shows that the CMP process can cause the +/- diffraction orders to become asymmetric, which confuses the DBO measurement signals. This study has been performed across the visible illumination spectrum and the results will be illustrated.