Paper · 24 March 2008
Empirical data validation for model building
Abstract
Optical proximity correction (OPC) has become an integral and critical part of process development for advanced technologies with challenging k1 requirements. OPC solutions in turn require stable, predictive models that can project the behavior of all structures. The test structures must cover all geometries that can occur in the layout in order to define the optimal correction for each feature and thus enable a manufacturing process with acceptable margin. The model is built from two main components. The first is knowledge of the process conditions, including the optical parameters (illumination source, wavelength, lens characteristics, etc.) as well as the mask definition, resist parameters, and process film stack. The second is the empirical critical dimension (CD) data collected with this process on specific test features; these results are used to fit and validate the model and to project resist contours for all allowable feature layouts. The quality of the model is therefore highly dependent on the integrity of the process data collected for this purpose. Since the test pattern suite generally extends below the resolution limit that the process can support with adequate latitude, the CD measurements can often be quite noisy, with marginal signal-to-noise ratios. For the model to be reliable and to represent the process behavior faithfully, the empirical data must be scrutinized to ensure that it is not dominated by measurement noise or flyer (outlier) points. The primary approach to generating a clean, smooth, and dependable empirical data set should be replicated measurement sampling, which statistically reduces measurement noise by averaging. However, it is often impractical to collect the amount of data this method requires. This paper studies an alternate approach: further smoothing the measured data by curve fitting in order to identify remaining questionable measurement points for engineering scrutiny, since such points risk incorrectly skewing the model. In addition to purely statistical curve fitting, another concept merits investigation: using first-principles, simulation-based characteristic coherence curves to fit the measured data.
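To make the screening workflow described in the abstract concrete, here is a minimal sketch (not from the paper; the through-pitch test series, quadratic trend, replicate count, and 3-sigma threshold are all illustrative assumptions). It averages replicated CD measurements to suppress noise, fits a smooth statistical curve, and flags large-residual points as candidates for engineering scrutiny rather than automatic rejection.

import numpy as np

def average_replicates(cd_replicates):
    # Replicated sampling: average repeated CD measurements per feature
    # to statistically reduce measurement noise (units assumed nm).
    # cd_replicates has shape (n_features, n_replicates).
    mean = cd_replicates.mean(axis=1)
    sem = cd_replicates.std(axis=1, ddof=1) / np.sqrt(cd_replicates.shape[1])
    return mean, sem

def flag_for_scrutiny(pitch, cd_mean, degree=2, n_sigma=3.0):
    # Statistical curve fit: a low-order polynomial in pitch stands in for
    # the smooth process trend (an assumption; the paper also considers
    # simulation-based characteristic coherence curves for this role).
    coeffs = np.polyfit(pitch, cd_mean, degree)
    residuals = cd_mean - np.polyval(coeffs, pitch)
    # Robust scale via the median absolute deviation, so a flyer point
    # does not inflate its own acceptance threshold.
    mad = np.median(np.abs(residuals - np.median(residuals)))
    sigma = 1.4826 * mad if mad > 0 else residuals.std(ddof=1)
    # Flag points for engineering review, not automatic rejection.
    return np.abs(residuals) > n_sigma * sigma

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pitch = np.linspace(130.0, 600.0, 24)       # hypothetical through-pitch series, nm
    true_cd = 65.0 + 0.02 * (pitch - 130.0)     # assumed smooth underlying trend
    reps = true_cd[:, None] + rng.normal(0.0, 1.5, (24, 5))  # 5 noisy replicates
    reps[7, :] += 12.0                          # inject one flyer measurement

    cd_mean, sem = average_replicates(reps)
    for idx in np.where(flag_for_scrutiny(pitch, cd_mean))[0]:
        print(f"review point {idx}: pitch {pitch[idx]:.0f} nm, CD {cd_mean[idx]:.1f} nm")

In the simulation-based variant the abstract mentions, the fitted trend would presumably come from a characteristic coherence curve computed from first principles rather than a polynomial, with the same residual screening applied to the measured data.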
© 2008 Society of Photo-Optical Instrumentation Engineers (SPIE).
Aram Kazarian "Empirical data validation for model building", Proc. SPIE 6922, Metrology, Inspection, and Process Control for Microlithography XXII, 69221I (24 March 2008); https://doi.org/10.1117/12.773414
KEYWORDS
Data modeling, Optical proximity correction, Process modeling, Photoresist processing, Statistical modeling, Calibration, Critical dimension metrology