AVS2002 Session AS-MoA: Quantification & Accuracy in Surface Analysis
2:00 PM (Invited)
AS-MoA-1 Toward a Comprehensive Quantitative Workbench for Surface Analysis
R.A. Weller (Vanderbilt University) I will address issues in the application of symbolic computation to surface analysis. Until quite recently, the most important factors affecting the style of technical software have been the limitations imposed by the speed and storage capacity of contemporary computing hardware. In retrospect, while understandable in the context of the times, this linkage has produced computational tools that lack generality, are inflexible, or that must be frequently updated because of evolving computer hardware or operating system software. The seeds of an alternative approach have been sown by the authors of modern tools for general-purpose symbolic mathematical computation, where fundamental considerations argue for hardware independence and the generality of algorithms. Symbolic computation is a revolutionary computing technology. Mathematics is an exercise in discovering patterns and manipulating symbols according to complex and exceedingly numerous, but well-defined and self-consistent rules. When advances in computer speed and memory capacity made it possible to store and implement these rules automatically, the stage was set for a revolution on a scale comparable to the revolution produced by automatic numerical computation five decades ago. Some implications of this revolution for the field of surface analysis will be presented, through examples drawn from medium energy backscattering spectrometry, four-point probe measurements, and radiation effects in semiconductors. The distinctive properties of an extensible surface analyst's quantitative workbench will be discussed. An important conclusion is that most technical software now being written should be based on robust algorithms and fidelity to correct physics without (much) regard for the characteristics of the hardware on which it will initially be implemented. This work has been supported in part by the U.S. Army Research Office through grant DAAD 19-99-1-0283.
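As a minimal illustration of the symbolic-computation approach described above (not the author's workbench, and with illustrative masses only), the kinematic factor used in backscattering spectrometry can be kept as an exact, hardware-independent expression until the moment it is evaluated:

```python
# A minimal sketch (not the author's workbench) of symbolic computation for a
# surface-analysis quantity: the elastic backscattering kinematic factor
# K = [(sqrt(M2^2 - M1^2 sin^2(theta)) + M1 cos(theta)) / (M1 + M2)]^2.
# The numerical masses below are illustrative values.
import sympy as sp

M1, M2, theta = sp.symbols("M1 M2 theta", positive=True)

# Kinematic factor for a projectile of mass M1 scattered by a target atom
# of mass M2 >= M1 through laboratory angle theta.
K = ((sp.sqrt(M2**2 - (M1 * sp.sin(theta))**2) + M1 * sp.cos(theta))
     / (M1 + M2))**2

# The expression stays exact until it is evaluated or specialized.
K_heavy_target = sp.limit(K, M2, sp.oo)   # -> 1 for an infinitely heavy target
K_He_on_Si = K.subs({M1: 4.0026, M2: 27.977, theta: sp.rad(150)}).evalf()

print(sp.simplify(K))
print("heavy-target limit:", K_heavy_target)
print("K(He -> Si, 150 deg) ~", round(float(K_He_on_Si), 3))
```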
2:40 PM
AS-MoA-3 Quantitative XPS and the Morphology Problem: Simple Algorithm for the Amount of Substance at the Surface
S. Tougaard (University of Southern Denmark) It is well known that, due to electron attenuation, the measured XPS peak intensity depends strongly on the in-depth atom distribution. Quantification based only on the peak intensity therefore leads to huge uncertainties. The problem was essentially solved by developing models for the detailed analysis of the energy distribution of emitted electrons, leading to the algorithms summarized in Ref. 1. These algorithms have been extensively tested experimentally and found to determine the morphology of surfaces with quite high accuracy.2 Practical application of these algorithms has increased since ready-to-use software packages were made available,3 and they are now being used in labs worldwide. These software packages are easy to use, but they need operator interaction; they are not well suited for automatic data processing, and there is an additional need for simplified strategies that can be automated. In this paper we report on a very simple algorithm that takes the morphology effect into account. It is a slightly more accurate version of the algorithm previously proposed by Tougaard (eq. (8) in Ref. 4). Although it was proposed more than a decade ago, the practical applications of this simple formula have not previously been studied in any great detail. The algorithm gives the amount of atoms within the outermost 3 IMFPs with good accuracy, and it also gives a rough estimate of the in-depth morphology. The validity of the simple algorithm is tested on several experimental systems, and the results are compared to analyses of the same samples quantified by more accurate methods.
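To see why the peak intensity alone is ambiguous, consider the attenuation integral I ∝ ∫ c(z) exp(−z/λ) dz. The sketch below is a generic illustration with assumed IMFP and depth profiles, not the algorithm of this abstract; it shows two very different morphologies giving essentially the same peak intensity:

```python
# Illustration (not the algorithm of the abstract): the measured XPS peak
# intensity I ~ integral of c(z) * exp(-z / (lambda * cos(theta))) dz,
# so very different in-depth profiles can give identical peak intensities.
# The IMFP and the two profiles below are assumed example values.
import numpy as np

imfp = 3.0          # inelastic mean free path, nm (assumed)
cos_theta = 1.0     # normal emission
z = np.linspace(0.0, 10 * imfp, 4000)   # depth grid, nm
dz = z[1] - z[0]
att = np.exp(-z / (imfp * cos_theta))   # attenuation of signal from depth z

def peak_intensity(conc_profile):
    """Relative peak intensity for a depth profile c(z) (arbitrary units)."""
    return np.sum(conc_profile * att) * dz

# Profile A: uniform dilute bulk concentration
c_bulk = np.full_like(z, 0.2835)
# Profile B: pure overlayer confined to the outermost 1 nm
c_layer = np.where(z <= 1.0, 1.0, 0.0)

print("I(bulk)  =", round(peak_intensity(c_bulk), 3))
print("I(layer) =", round(peak_intensity(c_layer), 3))
# Both intensities are nearly equal (~0.85), although the amount of substance
# within the outermost 3 IMFPs differs greatly -- hence the need for
# morphology-aware, background-based quantification.
```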
3:00 PM
AS-MoA-4 The Information Depth and the Mean Escape Depth in Auger Electron Spectroscopy and X-ray Photoelectron Spectroscopy
A. Jablonski (Polish Academy of Sciences, Poland); C.J. Powell (National Institute of Standards and Technology) The information depth (ID) is a measure of the sampling depth for the detected signal in Auger-electron spectroscopy (AES) and x-ray photoelectron spectroscopy (XPS), while the mean escape depth (MED) is a measure of surface sensitivity. We report ID and MED calculations for Si 2s, Si 2p3/2, Cu 2s, Cu 2p3/2, Au 4s, and Au 4f7/2 photoelectrons excited by Mg Kα x rays. Similar calculations were made for Si L3VV, Si KL23L23, Cu M3VV, Cu L3VV, Au N7VV, and Au M5N67N67 Auger transitions. The IDs and MEDs were derived from an analytical expression for the electron depth distribution function obtained from a solution of the kinetic Boltzmann equation within the transport approximation. The ratios of the IDs and the MEDs to the corresponding values found if elastic-electron scattering effects were negligible, RID and RMED, were less than unity and varied slowly with electron emission angle α for emission angles less than 50°. For larger emission angles, these ratios increased rapidly with α. For α ≤ 50°, average values of RID and RMED varied linearly with the single-scattering albedo, ω, a simple measure of the strength of elastic-scattering effects. For α = 70° and α = 80°, RID also varied linearly with ω but RMED showed a quadratic variation. As a result of elastic scattering of the signal electrons, AES and XPS measurements at α = 80° are less surface-sensitive than would be expected if elastic scattering had been neglected. Conversely, AES and XPS measurements made for α ≤ 50° are more surface-sensitive as a result of elastic-scattering effects.
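For reference, the values against which the ratios RID and RMED are defined follow from the straight-line approximation, where the depth distribution function is a simple exponential. A short sketch with an assumed IMFP (illustrative only, not the transport-approximation calculations of this abstract):

```python
# Straight-line-approximation (SLA) reference values against which the
# abstract's ratios R_ID and R_MED are defined (a sketch; the IMFP is assumed).
import numpy as np

def sla_mean_escape_depth(imfp_nm, alpha_deg):
    """MED when elastic scattering is neglected: lambda * cos(alpha)."""
    return imfp_nm * np.cos(np.radians(alpha_deg))

def sla_information_depth(imfp_nm, alpha_deg, fraction=0.95):
    """Depth from which a given fraction of the detected signal originates,
    assuming an exponential depth distribution exp(-z / (lambda cos alpha))."""
    return -imfp_nm * np.cos(np.radians(alpha_deg)) * np.log(1.0 - fraction)

imfp = 2.7  # assumed IMFP in nm for an illustrative photoelectron line
for alpha in (0, 50, 70, 80):
    med = sla_mean_escape_depth(imfp, alpha)
    info = sla_information_depth(imfp, alpha)
    print(f"alpha={alpha:2d} deg: MED={med:.2f} nm, ID(95%)={info:.2f} nm")
# Elastic scattering modifies both quantities; the abstract reports the
# corresponding ratios R_ID and R_MED versus the single-scattering albedo omega.
```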
3:20 PM
AS-MoA-5 Wavelets: A New Technique for Spectral Processing in Surface Science - Applications to Filtering and Deconvoluting HREELS and XPS Data
C. Charles, J.P. Rasson, G. Leclerc, P. Louette, J.J. Pireaux (Facultés Universitaires Notre-Dame de la Paix, Belgium) The last decade has witnessed the emergence of powerful new signal-analysis tools: the wavelet transform is one of them.1 By simultaneously taking into account both the time and frequency domains, a wavelet analysis is a priori more efficient and covers a broader range of applications than the Fourier transform. The wavelet theory will be briefly presented, with a comparison to Fourier analysis. Three applications for HREELS (High Resolution Electron Energy Loss Spectroscopy) and XPS (X-Ray Photoelectron Spectroscopy) will follow: noise filtering, peak detection and deconvolution. We first use synthetic data, a quite common practice in statistics: the correct answer is known, so the validity and robustness of the algorithms can be assessed under clear hypotheses and errors can be calculated. The filtering algorithm proceeds with an original ‘Local in Time and Frequency Translation Invariant Poisson Smoothing’ code that adapts itself to a spectrum containing peaks of very different amplitudes. Regions with a local maximum are then automatically detected with wavelets, allowing a Localized Least Squares method to precisely locate a peak and determine its intensity. Different applications on real HREELS and XPS data are illustrated; the results are particularly encouraging.
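A generic wavelet-denoising sketch (using the PyWavelets library, with assumed wavelet family, decomposition level and threshold rule; it is not the authors' translation-invariant Poisson smoothing code) illustrates the filtering step on a synthetic two-peak spectrum:

```python
# Generic wavelet denoising of a synthetic spectrum with peaks of very
# different amplitudes; wavelet, level and threshold rule are assumed choices.
import numpy as np
import pywt

def wavelet_denoise(spectrum, wavelet="sym8", level=5):
    """Soft-threshold the detail coefficients with a universal threshold."""
    coeffs = pywt.wavedec(spectrum, wavelet, level=level, mode="periodization")
    # Noise scale estimated from the finest detail coefficients (MAD estimator)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thresh = sigma * np.sqrt(2.0 * np.log(len(spectrum)))
    denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft")
                              for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet, mode="periodization")[:len(spectrum)]

# Synthetic test spectrum: two peaks of very different amplitude plus counting
# noise, mimicking the validation-on-synthetic-data strategy described above.
x = np.linspace(0, 10, 1024)
clean = 100 * np.exp(-((x - 3) / 0.15)**2) + 5 * np.exp(-((x - 7) / 0.3)**2)
noisy = np.random.poisson(clean + 2).astype(float)
print("rms error:", np.sqrt(np.mean((wavelet_denoise(noisy) - clean)**2)))
```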
3:40 PM
AS-MoA-6 Ultra Thin SiO2 on Si: Quantification of the Oxide Thickness and Carbonaceous Contamination
M.P. Seah, S.J. Spencer (National Physical Laboratory, UK) An analysis is made of the quantification issues in the measurement of ultra-thin layers of SiO2, and of carbonaceous contamination, on (100) and (111) polished Si surfaces. For the analysis of the oxide thickness, a simple equation is generally used involving two parameters: the attenuation length of photoelectrons in the oxide, L, and the ratio, Ro, of the intensities of the Si 2p peak from bulk thermal SiO2 and from pure Si. An analysis of previously reported measurements of L gives an average value only 6% less than the theoretical value. However, careful measurements of Ro, via two routes, indicate consistently that a value of 0.88 ± 0.05 should be used rather than the calculated value of 0.53 ± 0.05. This difference may arise through systematic errors in the values for the relevant inelastic mean free paths, the silicon dioxide density and the shake-up intensity contributions. Previously reported experimental values of Ro range from 0.67 to 0.87. Sources of uncertainty in these parameters and in the thickness will be addressed. Measurements of a basis set of materials for an international study, started in March 2002 under the auspices of the Consultative Committee for Amount of Substance (CCQM), show average agreement with ellipsometry to better than 0.13 nm over the thickness range 2 nm to 8 nm. Measurements of the carbonaceous contamination show how to clean and store the samples effectively, and provide the relevant parameters for a consistent carbon quantification.
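The simple two-parameter relation referred to above is commonly written as d = L cos θ ln(1 + Rexpt/Ro), with Rexpt the measured oxide-to-substrate Si 2p intensity ratio. A sketch with assumed numerical inputs (not the NPL measurements) shows the sensitivity to the choice of Ro:

```python
# Widely used two-parameter relation for SiO2 thickness from the Si 2p
# intensity ratio (consistent with the L and Ro of the abstract):
#   d = L * cos(theta) * ln(1 + R_expt / Ro),  R_expt = I(oxide) / I(Si).
# The default L and the example intensities are assumed values.
import math

def oxide_thickness_nm(i_oxide, i_silicon, L_nm=2.96, Ro=0.88, theta_deg=0.0):
    """Oxide thickness from measured Si 2p peak intensities."""
    r_expt = i_oxide / i_silicon
    return L_nm * math.cos(math.radians(theta_deg)) * math.log(1.0 + r_expt / Ro)

# Example: equal oxide and substrate Si 2p intensities at normal emission
print(f"d(Ro=0.88) = {oxide_thickness_nm(1.0, 1.0):.2f} nm")
# The same data interpreted with the calculated Ro = 0.53 give a thicker oxide:
print(f"d(Ro=0.53) = {oxide_thickness_nm(1.0, 1.0, Ro=0.53):.2f} nm")
```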
4:00 PM
AS-MoA-7 Correcting for Detector-Induced Non-Linearity in Photoelectron Spectroscopy Counting Systems
C.S. Fadley, N. Mannella, S. Marchesini (University of California at Davis); A. Kay (Intel Corporation); S.-H. Yang (IBM Almaden Research Center); S. Mun (Intel Corporation); M.A. van Hove (Lawrence Berkeley National Laboratory) The photoelectron intensity levels reached in exciting surfaces with both laboratory sources and third-generation synchrotron radiation can in many cases exceed the linear response range of the final detection system involved.1 For example, the quantitative analysis of complex oxides via core-level intensities has been found to be strongly influenced by this non-linearity,2 as have angle-resolved valence spectra3 and magnetic dichroism measurements on magnetic systems.2 Experiments involving resonant photoemission, in which the photon energy is scanned through an absorption edge, are also strongly affected.4 In this paper, we demonstrate two quantitatively accurate procedures to correct for such non-linearity effects. The first method directly yields the detector efficiency by measuring a flat-background reference intensity as a function of incident x-ray flux, while the second method determines the detector response from an analysis of broad-scan survey spectra at different incident x-ray fluxes. Although we will use one spectrometer system (the Scienta ES200) as an example, the methodologies discussed here should be useful for many other cases. This work was supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, Materials Sciences Division, under Contract No. DE-AC03-76SF00098.
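A sketch of the general idea behind the first correction strategy, with assumed calibration data and a generic polynomial response model (not the authors' Scienta ES200 calibration):

```python
# Sketch: measure a flat-background reference intensity versus incident x-ray
# flux, fit the (non-linear) detector response, then map measured count rates
# back onto a linear scale. Data and model below are assumed illustrations.
import numpy as np

# Calibration data: relative incident flux (proportional to the true rate) vs.
# measured count rate on a featureless background region (assumed values).
flux = np.array([0.05, 0.1, 0.2, 0.4, 0.6, 0.8, 1.0])
measured = np.array([0.050, 0.099, 0.195, 0.372, 0.534, 0.678, 0.802])

# Fit measured = f(true) with a low-order polynomial.
f = np.poly1d(np.polyfit(flux, measured, deg=3))

def linearize(count_rate):
    """Invert the fitted response to recover a rate proportional to the flux."""
    roots = (f - count_rate).roots
    real = roots[np.isreal(roots)].real
    return real[(real >= 0) & (real <= 1.2)].min()

print("corrected rate:", round(linearize(0.678), 3))   # recovers ~0.8
```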
4:20 PM
AS-MoA-8 Theoretical Foundations of Surface Stress Measurements using Atomic Force Microscope Cantilevers
J.E. Sader (The University of Melbourne, Australia) Due to its extreme sensitivity and speed, the atomic force microscope (AFM) has recently emerged as an important tool in the measurement of surface stress. Fundamental to this application is theoretical knowledge of the effects of surface stress on the deflections of AFM cantilever plates. This is normally obtained by use of Stoney's equation, which is derived for a completely unrestrained plate. In this talk, the applicability of Stoney's equation to rectangular and V-shaped AFM cantilever plates is investigated. It is found that use of Stoney's equation can lead to significant errors in measurements made using AFM cantilevers. Detailed finite element results and new analytical formulae, which replace Stoney's equation and greatly improve on its accuracy, will be presented.
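For orientation, the unrestrained-plate (Stoney) estimate whose limits the talk examines can be sketched as follows, with assumed cantilever properties (not the finite-element or improved formulae of the talk):

```python
# Stoney-type estimate of cantilever end deflection under a differential
# surface stress; cantilever dimensions and material values are assumed.
import math

def stoney_end_deflection(delta_stress, length, thickness, youngs, poisson):
    """End deflection for a differential surface stress delta_stress (N/m),
    using the Stoney curvature kappa = 6(1-nu)*delta_stress/(E*t^2) and
    dz ~ kappa * L^2 / 2."""
    kappa = 6.0 * (1.0 - poisson) * delta_stress / (youngs * thickness**2)
    return 0.5 * kappa * length**2

# Assumed rectangular Si cantilever: 500 um long, 1 um thick, E = 169 GPa,
# nu = 0.27, and a surface-stress change of 1 mN/m.
dz = stoney_end_deflection(1e-3, 500e-6, 1e-6, 169e9, 0.27)
print(f"predicted end deflection ~ {dz * 1e9:.1f} nm")
```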
4:40 PM
AS-MoA-9 Elemental and Structural Analysis of Si Wafers and Thin Films Using an X-ray Waveguide-Resonator
V.K. Egorov, E.V. Egorov (Russian Academy of Science (IPMT RAS), Russia) Compositional and structural analysis of Si wafer surfaces and thin films is carried out with X-ray beams at grazing incidence (TXRF, GIRD). These methods require an X-ray line beam of small width, low divergence and high radiation density; monochromatization of the beam is not obligatory. Such a beam is formed by a planar X-ray waveguide-resonator (PXWR).1 A PXWR is a narrow, extended slit formed by two plane-polished dielectric reflectors. The slit width must fall within a certain interval; for quartz reflectors the intervals are 15-45 nm for Mo Kα and 15-95 nm for Cu Kα. Waveguides capture radiation within the angular aperture Δθ < 2θc, where θc is the critical angle for total reflection. Realistic emergent beams have widths of 50-100 nm, heights of 1-10 mm, divergence Δθ < 2θc, and a radiation density exceeding that of conventionally formed beams by 3-4 orders of magnitude. A composite PXWR gives an emergent beam divergence Δθ << 2θc while preserving the total intensity. Schemes of diffraction and spectroscopic instruments equipped with a PXWR are considered. Data for grazing-incidence X-ray diffraction of such beams on thin films, and for a focusing scattering scheme, are presented. X-ray fluorescence spectra collected at grazing incidence on film targets with PXWR-formed beams are discussed, and a model for the spectrum treatment that accounts for the use of a PXWR for target excitation is formulated. The application of a PXWR lowers the element detection limits by more than one order of magnitude compared with a standard TXRF spectrometer scheme, and a TXRF instrument with a PXWR is cheaper and simpler to operate. A PXWR can be used for TXRF studies of Si wafers with d > 300 mm and can be adapted for in situ measurements.
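For orientation, the total-reflection critical angle that sets the PXWR capture aperture can be estimated as θc ≈ √(2δ) with δ = r_e λ² n_e / (2π). A back-of-the-envelope sketch with assumed quartz parameters (not values taken from the abstract):

```python
# Estimate (not from the abstract) of the critical angle theta_c that sets the
# PXWR capture aperture (delta_theta < 2*theta_c) for quartz reflectors and
# Cu K-alpha radiation. Material constants below are assumed values.
import math

r_e = 2.818e-15          # classical electron radius, m
wavelength = 1.5406e-10  # Cu K-alpha, m
density = 2.65e3         # quartz (SiO2), kg/m^3 (assumed)
molar_mass = 60.08e-3    # SiO2, kg/mol
electrons_per_unit = 30  # Z(Si) + 2*Z(O)
n_avogadro = 6.022e23

n_e = density / molar_mass * n_avogadro * electrons_per_unit   # electrons / m^3
delta = r_e * wavelength**2 * n_e / (2.0 * math.pi)
theta_c = math.sqrt(2.0 * delta)                               # radians

print(f"theta_c ~ {math.degrees(theta_c):.2f} deg, "
      f"capture aperture 2*theta_c ~ {math.degrees(2 * theta_c):.2f} deg")
```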