AVS 70 Session AS-ThM: Machine Learning and Data Evaluation

Thursday, November 7, 2024 8:00 AM in Room 117
Thursday Morning


8:00 AM AS-ThM-1 Redox XPS: Reliable and Automatic Peak Fitting of XPS Chemical States
Peter Cumpson (Sanispectra Ltd); Dhilan Devadasan (Thermo Fisher Scientific, UK); Robert Weatherup (Oxford University, UK); Silvia Gazzola (University of Bath, U.K.); Tim Nunney (Thermo Fisher Scientific, UK)

At the AVS symposium in 2023 we introduced Redox XPS, which removes ambiguities in fitting narrow-scan XPS spectra by gas-phase oxidation (or reduction), typically with the assistance of ultraviolet light, within the XPS instrument. Since then we have automated the procedure, so that for a set of samples on a sample block one obtains montages of spectra, each montage representing a progression of oxidation states. This takes more "wall-clock" time, but no more operator time, than a single spectrum.

Redox XPS is useful for removing some ambiguities, especially around very small peaks from unexpected elements that are otherwise difficult to identify, but its principal advantage is the ability to do automatic peak fitting in a reliable and quantitative way. Instead of comparing with spectra obtained on other instruments (for example, libraries of spectra acquired on instruments that may have different transmission functions, energy calibrations or energy resolution), one can reach a reliable quantification from relative measurements within a set of spectra acquired on one instrument on the same day.

To extract the peak shapes and intensities corresponding to the multiple chemical states originally present in the sample, we here apply the following (a generic illustrative sketch follows the list):

  1. A regularization method to avoid over-fitting, by including constraints arising from the physics and chemistry of XPS peak shapes,
  2. Non-negativity constraints (valid for all real spectra, where counts and concentrations can never be negative), and
  3. A feature of the inelastic background that we identify for the first time, and that has somehow escaped the notice of the community for the 30 years or more that XPS background subtraction has been common.
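
The abstract does not spell the algorithm out; purely as a generic illustration of combining points 1 and 2 (a regularizer with non-negativity constraints), a Tikhonov-regularized non-negative least-squares fit over a set of candidate component line shapes might look like the sketch below. The design matrix, peak shapes and parameter values are hypothetical, and the background feature of point 3 is not reproduced.

```python
# Sketch: Tikhonov-regularized non-negative least squares for XPS peak fitting.
# Hypothetical setup: the columns of A are candidate component line shapes (e.g.
# one per oxidation state) and y is a background-subtracted narrow-scan spectrum.
# This is NOT the authors' algorithm, only a generic illustration of combining
# a regularizer with non-negativity constraints.
import numpy as np
from scipy.optimize import nnls

def fit_components(A, y, lam=1e-2):
    """Solve min_x ||A x - y||^2 + lam ||x||^2  subject to  x >= 0."""
    n = A.shape[1]
    # Absorb the regularizer into an ordinary NNLS problem by augmenting the system.
    A_aug = np.vstack([A, np.sqrt(lam) * np.eye(n)])
    y_aug = np.concatenate([y, np.zeros(n)])
    x, residual = nnls(A_aug, y_aug)
    return x, residual

# Toy example: two Gaussian "chemical state" components on an energy axis.
energy = np.linspace(280, 295, 300)
gauss = lambda e0, w: np.exp(-0.5 * ((energy - e0) / w) ** 2)
A = np.column_stack([gauss(284.8, 0.6), gauss(286.3, 0.7)])
true_x = np.array([1000.0, 400.0])
y = A @ true_x + np.random.default_rng(0).normal(0, 5, energy.size)
x_fit, _ = fit_components(A, y)
print("fitted intensities:", x_fit)
```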

This algorithm is tested on Redox XPS data from 12 different elements acquired automatically. In most cases (and at the expense of more instrument time) this solves the problem of peak fitting that otherwise can take up the valuable time and skills of XPS experts. Further, the ability to fit complex peak shapes automatically means that Auger features, often ignored in the past due to their prohibitively complex line shapes, become very useful to analyse quantitatively by this procedure. Most of the data we show are from progressive oxidation using UV and ozone, but we have also demonstrated reduction in some cases.

8:15 AM AS-ThM-2 Exploring the Benefits of Automated Redox Reactions in XPS Analysis
Robin Simpson, Tim Nunney, Paul Mack (Thermo Fisher Scientific, UK)

This presentation investigates the benefits of automated, in-situ redox reactions for producing well-controlled oxide growth on the surface of various sample types. The driving force behind such a procedure lies in the potential for generating a sequence of spectra from a progressively chemically modified surface to remove ambiguities that can lead to misinterpretation, thereby aiding faster understanding of the unmodified surface. Our study presents XPS results from coupled stepwise oxidation/reduction of surfaces to help resolve such ambiguities across a wide array of materials. We use gas-phase oxidizing agents to control the redox states of a specimen, leveraging the logarithmic growth of oxide thickness. This oxidation is implemented using vacuum ultraviolet (VUV) light and the generation of ozone and gas-phase hydroxyl free radicals close to the surface of the specimens within the entry lock of the Thermo Scientific Nexsa surface analysis instrument. This work focusses on the benefits of automating this process, to ascertain the potential merits of including it in a standard operating procedure for XPS analysis.
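
(For orientation only: the "logarithmic growth of oxide thickness" referred to above is commonly parameterized by a direct-logarithmic law of the form $d(t) \approx d_0 + k \ln(1 + t/\tau)$, where $d_0$, $k$ and $\tau$ are material- and condition-dependent constants; the specific functional form and symbols here are an assumption, not taken from the abstract. Under such a law, short exposures early in the sequence give well-spaced oxide thicknesses, while long exposures change the thickness only slowly.)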

8:30 AM AS-ThM-3 Fourier Denoising of X-ray Photoelectron Spectroscopy Data
Matthew Linford, Alvaro Lizarbe, Kristopher S. Wright (Brigham Young University); Jeff Terry (Illinois Institute of Technology); David E. Aspnes (North Carolina State University)

There has long been something of a prohibition on the smoothing/denoising of X-ray photoelectron spectroscopy (XPS) data. In this talk, we reconsider this possibility.

Fourier analysis is powerful for data analysis because it allows one to separate the information in a spectrum in a way that is not possible in direct space. This separation takes place because the information in spectra, including XPS spectra, usually resides in point-to-point correlations, which end up in low-index Fourier coefficients, while noise resides in point-to-point variations, which end up in high-index Fourier coefficients. One of the pillars of Fourier theory is the convolution theorem, which states that convolution in one domain (direct or reciprocal) is equivalent to multiplication in the other.
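
In the usual notation (an assumption here, since the abstract does not write it out), the convolution theorem reads $\mathcal{F}\{f \ast g\} = \mathcal{F}\{f\}\,\mathcal{F}\{g\}$, so applying a multiplicative filter to the Fourier coefficients of a spectrum is equivalent to convolving the spectrum in direct space with the inverse transform of that filter.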

In this talk, we begin by evaluating the effectiveness of common smooths such as the boxcar and Savitzky-Golay smooths. Both are flawed: they cannot fully remove high-frequency noise from data. We then discuss the use of the Gauss-Hermite (GH) filter for removing noise from Fourier-transformed XPS spectra. This adjustable filter is unity, or near unity, for lower-index Fourier coefficients, but drops off smoothly to zero for the higher-index coefficients. We show the use of this filter for a relatively broad Ag 4s XPS peak, a narrow scan (Ag 3d) with sharp spin-orbit components, and a metal peak that shows a significant step in the baseline. Various positions of the GH filter are considered in each case (this filter function is adjustable). We compare our results to 'true' spectra obtained by significant signal averaging. We make the case that appropriate Fourier filtering of XPS data may have a useful place in XPS data analysis. We also provide cautions for using this capability.
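
As a rough illustration of this style of Fourier-domain denoising (not the authors' exact Gauss-Hermite filter, whose functional form is not given here), one can multiply the DFT coefficients of a spectrum by a window that is near unity at low index and rolls smoothly to zero at high index; a plain Gaussian roll-off is used as a stand-in below.

```python
# Sketch: Fourier-domain denoising of a 1-D XPS spectrum.  The Gauss-Hermite
# filter itself is not reproduced; a Gaussian roll-off is substituted as a
# stand-in that is likewise ~1 at low Fourier index and falls smoothly to 0.
import numpy as np

def fourier_denoise(spectrum, cutoff_index=30):
    """Multiply the DFT coefficients by a smooth low-pass window and invert."""
    coeffs = np.fft.rfft(spectrum)                        # one-sided DFT of real data
    idx = np.arange(coeffs.size)
    window = np.exp(-0.5 * (idx / cutoff_index) ** 2)     # ~1 at low index, -> 0 at high index
    return np.fft.irfft(coeffs * window, n=spectrum.size)

# Toy example: a noisy synthetic peak on a flat background.
rng = np.random.default_rng(1)
x = np.linspace(0, 10, 512)
clean = 100 * np.exp(-0.5 * ((x - 5) / 0.4) ** 2)
noisy = rng.poisson(clean + 20).astype(float)             # Poisson counting noise
smoothed = fourier_denoise(noisy, cutoff_index=40)
```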

8:45 AM AS-ThM-4 Fourier Denoising of X-ray Photoelectron Spectroscopy Data: Applications to the Carbon Auger D-Parameter, HAXPES, and EasyEXAFS
Alvaro Lizarbe, Kristopher Wright, Gavin Murray, Garrett Lewis (Brigham Young University); Mark Isaacs (Diamond Light Source, UK); David Morgan (Cardiff University); David Aspnes (North Carolina State University); Matthew Linford (Brigham Young University)

We recently argued for the use of Fourier denoising with the Gauss-Hermite filter as a tool for separating the signal and noise in XPS spectra. The Gauss-Hermite filter removes high-frequency noise from spectra more effectively than the more traditional Savitzky-Golay and boxcar smooths. This approach works because noise is contained in point-to-point variations, while signal is in point-to-point correlations. In this talk, we discuss practical ways of implementing this capability. For example, the carbon Auger signal needs to be smoothed before it is differentiated to derive the D-parameter; otherwise, the numerical differentiation amplifies the noise. The Gauss-Hermite filter denoises these spectra effectively. Furthermore, differentiation of the signal is then easily performed by differentiating the individual harmonics of the discrete Fourier transform and summing their derivatives. HAXPES represents an important direction in modern XPS. However, the cross sections for photoemission are often considerably lower at the higher-energy X-rays used in HAXPES, so there are advantages to denoising these spectra as well. Finally, EasyEXAFS is an important, laboratory-based way of performing EXAFS, but in a number of cases the signal-to-noise ratios of the signals are low, and denoising them is likewise advantageous. In addition to demonstrating these new capabilities, we provide guidelines and cautions for using them.
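
A minimal sketch of the "differentiate the harmonics" idea follows, assuming uniform energy spacing and again substituting a generic Gaussian roll-off for the Gauss-Hermite filter; the D-parameter is then read off as the separation of the extrema of the differentiated C KLL signal.

```python
# Sketch: Fourier-domain differentiation of a (denoised) C KLL Auger spectrum.
# Each Fourier harmonic is differentiated by multiplying its coefficient by
# 2*pi*i*f; the filter is a Gaussian stand-in, not the authors' GH filter.
import numpy as np

def fourier_derivative(y, dx, cutoff_index=40):
    coeffs = np.fft.rfft(y)
    idx = np.arange(coeffs.size)
    window = np.exp(-0.5 * (idx / cutoff_index) ** 2)   # denoising filter
    freqs = np.fft.rfftfreq(y.size, d=dx)               # cycles per unit energy
    dcoeffs = 2j * np.pi * freqs * coeffs * window      # d/dE applied harmonic by harmonic
    return np.fft.irfft(dcoeffs, n=y.size)

def d_parameter(energy, auger_signal):
    """Energy separation between the extrema of the differentiated C KLL signal."""
    dy = fourier_derivative(auger_signal, dx=energy[1] - energy[0])
    return abs(energy[np.argmax(dy)] - energy[np.argmin(dy)])
```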

9:00 AM Invited AS-ThM-5 ASSD Student Award Finalist Talk: Stitching, Stacking and Multilayering: Practical Evaluation of ToF-SIMS Data with Machine Learning
Sarah Bamford, Wil Gardner, David Winkler (La Trobe University); Benjamin Muir (CSIRO Materials Science and Engineering, Australia); Paul Pigram (La Trobe University)

Time-of-flight secondary ion mass spectrometry (ToF-SIMS) is a powerful analytical technique capable of collecting mass spectral information in one, two and three spatial dimensions. ToF-SIMS images and depth profiles are large and complex hyperspectral data sets, and interpretation requires that their complexity be reduced. For two-dimensional (2D) data, individual ion peaks are often extracted and overlaid; for three-dimensional (3D) data, a selection of characteristic peaks is plotted in one dimension as a function of depth. These well-established methods are ideal for known or simple samples. However, for complex or unknown samples, they struggle to convey the depth of information captured within the data set. Furthermore, the choice of displayed ion peaks has the potential to impart user bias and make a significant difference to the interpretation of results.

Unsupervised machine learning, specifically self-organizing maps with relational perspective mapping (SOM-RPM), allows considered analysis of complex unknown samples. The SOM-RPM approach creates a color-coded similarity map in which changes in color are specifically graded to accord with changes in molecular state, by examining the totality of the data set. By pairing ToF-SIMS and SOM-RPM, the complete hyperspectral data set in 2D or 3D can be intuitively visualized, providing a unique picture of the local and global mass spectral relationships between individual pixels. The SOM-RPM methodology has proven to be a robust technique that offers substantial advantages in this field.
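
For readers unfamiliar with the method, a numpy-only sketch of the SOM half of SOM-RPM is given below; the relational-perspective-mapping colour grading is omitted, and all array names, grid sizes and training parameters are hypothetical.

```python
# Minimal self-organizing map (SOM) sketch for ToF-SIMS hyperspectral data.
# X is a (n_pixels, n_peaks) matrix of per-pixel spectra.  The RPM colouring
# used in SOM-RPM is not reproduced here.
import numpy as np

def train_som(X, grid=(20, 20), n_iter=5000, lr0=0.5, sigma0=5.0, seed=0):
    rng = np.random.default_rng(seed)
    rows, cols = grid
    W = rng.random((rows, cols, X.shape[1]))             # node weight vectors
    ii, jj = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
    for t in range(n_iter):
        x = X[rng.integers(len(X))]
        # best-matching unit (BMU): node whose weights are closest to this pixel
        d2 = ((W - x) ** 2).sum(axis=2)
        bi, bj = np.unravel_index(np.argmin(d2), d2.shape)
        # decaying learning rate and neighbourhood radius
        frac = t / n_iter
        lr = lr0 * (1 - frac)
        sigma = sigma0 * (1 - frac) + 1e-3
        h = np.exp(-((ii - bi) ** 2 + (jj - bj) ** 2) / (2 * sigma ** 2))
        W += lr * h[..., None] * (x - W)                  # pull neighbourhood toward the pixel
    return W

def map_pixels(X, W):
    """Return the BMU grid coordinates of every pixel (basis for a similarity map)."""
    d2 = ((W[None, ...] - X[:, None, None, :]) ** 2).sum(axis=3)
    flat = d2.reshape(len(X), -1).argmin(axis=1)
    return np.column_stack(np.unravel_index(flat, W.shape[:2]))
```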

This work will present several case studies across a broad range of sample types including:

  • Depth profiling of a multilayer double silver low emissivity glass coating which considers the entire mass spectrum at every voxel and illuminates interfacial mixing.
  • Depth profiling of polyaniline films highlighting structural flaws such as pinholes as well as subtle changes in chemistry caused by heat treatment.
  • Identification of subtle changes in 200 nm extracellular vesicles due to inflammatory response in releasing cells.

9:30 AM AS-ThM-7 Applications of Machine Learning in TOF SIMS Data Analysis: Classification and Quantitation
Lev Gelb, Amy Walker (University of Texas at Dallas)

We present progress towards analysis of TOF SIMS data using machine learning (ML) methods. We posit that TOF SIMS is not more widely used because the data is too complex to be interpreted without expert knowledge, and investigate how machine learning might help. We primarily train models on simulated “big” data sets constructed by combining and/or resampling experimental spectra, with a focus on neural-network architectures. A particular interest is in uncertainty quantification and evaluation of the reliability of the models.

In the case of classification, models are constructed that can identify the type of material studied from a TOF SIMS spectrum. These are trained on libraries of reference data. Of particular interest in this application is the number of reference spectra required and how that varies with the number of material types to be distinguished.

We also consider determining the composition of a homogeneous sample consisting of two or more components for which reference spectra are available. That is, the sample consists of compounds which appear in some reference library, and the algorithm should identify what compounds are present and in what relative quantities. Factors complicating this kind of analysis include statistical noise, matrix effects, background, calibration error, and the likely case that the reference spectra were not taken under the same conditions (primary ion, ion energy, instrument manufacturer, etc.) as the data to be analyzed. Our approach is to generate a large number of simulated high-resolution TOF SIMS spectra of multicomponent samples, again based on TOF SIMS reference data. Complicating factors are also incorporated to varying degrees, and the resulting data sets are used to train the models. Model performance is then studied and related to spectrum quality, complexity and complicating factors.
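
A toy sketch of the simulated-data idea for the quantification task follows: mixture spectra are generated as noisy linear combinations of library reference spectra, and the relative amounts are recovered by non-negative fitting against the same library. All spectra and values below are synthetic placeholders; the neural-network models, matrix effects and calibration errors discussed above are not reproduced.

```python
# Toy illustration of simulated mixture spectra and library-based quantification.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
n_channels, n_refs = 2000, 6
library = rng.random((n_refs, n_channels)) ** 4        # hypothetical reference spectra
library /= library.sum(axis=1, keepdims=True)          # normalise each reference

def simulate_mixture(fractions, total_counts=2e5):
    """Noisy spectrum of a homogeneous mixture with the given composition."""
    expected = total_counts * fractions @ library
    return rng.poisson(expected).astype(float)          # counting statistics

true_fractions = np.array([0.5, 0.3, 0.2, 0.0, 0.0, 0.0])
spectrum = simulate_mixture(true_fractions)
amounts, _ = nnls(library.T, spectrum)                 # columns of library.T = references
estimated = amounts / amounts.sum()
print("estimated fractions:", np.round(estimated, 3))
```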

9:45 AM AS-ThM-8 Dorothy M. and Earl S. Hoffman Awardee Talk/ASSD Student Award Finalist Talk: Advancements in Tracer Diffusion Modeling with ToF-SIMS Depth Profiling
Nicolas Molina, Andrei Dolocan, Gregory Rodin, Filippo Mangolini (The University of Texas at Austin)

Time-of-flight secondary ion mass spectrometry (ToF-SIMS) stands as a robust analytical characterization technique for determining the depth distributions of chemical moieties within solids. One of its main advantages is its capability to acquire and differentiate between the characteristic signals of isotopes of the same chemical element during a measurement. As such, stable isotopes have been employed as diffusion sources in tracer diffusion studies in a variety of fields, from energy storage to failure analysis. The standard methodology for these tracer diffusion experiments includes sample preparation, isotope dosing, ToF-SIMS depth profiling, and diffusion modeling. In this presentation, we highlight current shortcomings when modeling ToF-SIMS depth profiles in tracer diffusion experiments and proposed improvements to overcome them, namely: 1) incorporation of the spatio-temporal experimental path of the diffusing isotope, which is crucial when ToF-SIMS acquisition and diffusion timescales are comparable (i.e., the depth profile cannot be considered frozen in time anymore) and allows for correcting the calculated transport constants, i.e., diffusivity (D) and surface exchange coefficient (Γ); 2) refinement of Fick’s diffusion equations when the isotopic abundance (i.e., ²H / (¹H + ²H)) is used as a proxy for concentration instead of the isotope intensity (e.g., ²H), as classical Fick's laws along with their boundary conditions do not hold for the isotopic abundance. These advancements in diffusion modeling open the path for the accurate use of ToF-SIMS for quantifying transport constants in tracer diffusion studies across a broad range of applications, including in those involving the use of ultrathin films.
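
For orientation, the classical "frozen-profile" model that these refinements go beyond is Crank's solution for in-diffusion into a semi-infinite solid with a finite surface exchange coefficient; a minimal sketch with hypothetical values is given below (it includes neither the spatio-temporal acquisition path nor the abundance-based form of Fick's equations described above).

```python
# Sketch of the classical "frozen-profile" tracer-diffusion model (Crank's
# solution for a semi-infinite solid with surface exchange), often fitted to
# ToF-SIMS isotope depth profiles.  The refinements described in the abstract
# are NOT included; all numerical values below are hypothetical.
import numpy as np
from scipy.special import erfc

def tracer_profile(x, t, D, Gamma, c_bg=0.000115, c_gas=1.0):
    """Normalised isotopic fraction vs depth x after anneal time t.

    D      : tracer diffusivity          [cm^2/s]
    Gamma  : surface exchange coefficient [cm/s]
    c_bg   : natural background abundance of the tracer (e.g. ~0.0115% for 2H)
    """
    h = Gamma / D
    arg = x / (2.0 * np.sqrt(D * t))
    norm = erfc(arg) - np.exp(h * x + h * h * D * t) * erfc(arg + h * np.sqrt(D * t))
    return c_bg + (c_gas - c_bg) * norm

depth = np.linspace(0, 2e-4, 500)                       # 0 to 2 um, in cm
profile = tracer_profile(depth, t=3600.0, D=1e-12, Gamma=1e-8)
```
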
11:00 AM Invited AS-ThM-13 ASSD Peter Sherwood Award Talk: Hybrid SIMS: Evolution of a SIMS Instrument Combining Time-of-Flight and Orbital Trapping Mass Spectrometry
Alexander Pirkl (ION-TOF GmbH, Germany)

Secondary ion mass spectrometry (SIMS) offers the possibility to acquire laterally resolved chemical information from submicron regions on inorganic and organic samples. However, SIMS analysers usually lack the mass resolution, mass accuracy and high-resolution MS/MS capabilities required for the thorough investigation of complex biological materials.

To specifically address the imaging requirements of the life science field, and following the original idea of Prof Ian Gilmore (NPL), a Hybrid SIMS instrument was developed in a research project by IONTOF and Thermo Fisher Scientific in close cooperation with other partners of the 3D OrbiSIMS project [1]. The instrument combines an Orbitrap-based Q Exactive HF mass analyser with a high-end ToF-SIMS system, providing a mass resolution > 240,000 and a mass accuracy < 1 ppm in conjunction with high-lateral-resolution cluster SIMS imaging capabilities. In this contribution we will spotlight the development steps involved in the initial project and in its still-ongoing evolution, always with the aim of making this technology available to more applications.

[1] Passarelli et al., Nature Methods 14, 1175–1183 (2017).

11:30 AM AS-ThM-15 Multiplexing Analysis Using Microarray Plate for Fast Analysis by ToF-SIMS
Tanguy Terlier, Carlos Gramajo (Rice University)

Multiplexing analysis is a type of assay that permits complex samples to be characterized in a single analysis run. Here, we used a computer-controlled CO2 laser cutter to machine glass plates patterned with multiple 10-micron-deep microwells. Among the most suitable characterization tools for multiplexing analysis, time-of-flight secondary ion mass spectrometry (ToF-SIMS) is a powerful surface analytical technique that provides detailed elemental and molecular information about surfaces, thin layers, interfaces, and full 3D volumes of samples.

Thanks to its integrated large-area imaging capability, scanning areas of a few mm², ToF-SIMS characterization permits deeper exploration and better understanding of organic and biological materials with complex chemical structures. ToF-SIMS produces hyperspectral images in which each individual pixel contains a full mass-range spectrum. Spot areas can be selected by generating regions of interest, so that each sample can be treated individually. Our glass plate allows a library of 120 individual samples to be created. After identification of the characteristic fragment ions, multivariate analysis is used to establish the correlations between the molecular ions and to classify the relationships between the samples.

By combining the hyperspectral images with MVA techniques, we can elucidate the chemical composition of a large set of specimens to address complex analytical challenges or chemical reactions. The operation consists of performing a rapid single-scan ToF-SIMS analysis of the plate and then classifying the dataset for database matching or quantification of the composition. In an initial study, a small number of monomers were characterized, and the dataset was then used to discriminate the molecular signatures as a function of the functional groups. The method has been extended to several lipids to establish a library of characteristic fragments for further understanding of lipid deposition profiles from foreign-body responses. A second study identified the surface composition of antibody-conjugated gold nanoparticles. Our final example will focus on developing methods for inhibiting asphaltene deposition.
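
As a generic illustration of the MVA step (the specific multivariate method is not named above), one could run a principal component analysis over the mean spectrum of each microwell region of interest and group wells in the reduced space; array names and shapes below are hypothetical.

```python
# Generic MVA sketch: PCA over the mean spectrum of each microwell ROI,
# followed by grouping of wells in the reduced space.  Not the authors'
# specific workflow.
import numpy as np

def roi_mean_spectra(hyper_image, roi_masks):
    """hyper_image: (ny, nx, n_peaks);  roi_masks: list of boolean (ny, nx) masks."""
    return np.array([hyper_image[m].mean(axis=0) for m in roi_masks])

def pca(spectra, n_components=3):
    """Return scores and loadings of a mean-centred PCA computed via SVD."""
    centred = spectra - spectra.mean(axis=0)
    U, S, Vt = np.linalg.svd(centred, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]
    loadings = Vt[:n_components]
    return scores, loadings

# Wells that cluster together in `scores` share similar fragment-ion patterns,
# which is the basis for the library matching and classification described above.
```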

To conclude, we will demonstrate through various case studies how multiplexing analysis using microarray plates for fast ToF-SIMS analysis can offer a rapid solution for building databases and establishing reference libraries of fragment ions. In addition, this approach allows the classification and correlation of chemical profiles from complex compounds, and potentially the quantification of mixtures by dosing species ions from ToF-SIMS data.

11:45 AM AS-ThM-16 Exploring the Power of TOF-SIMS by Coupling Collision-Induced Dissociation with Surface-Induced Dissociation for Structural Analysis
Jacob Schmidt, Gregory Fisher (Physical Electronics USA)

TOF-SIMS with kilo-electron-volt collision-induced dissociation (CID) tandem MS is a powerful tool for compositional identification and structural elucidation of molecules, metabolites, and degradation products, owing to its ability to isolate ions of interest and provide further insight into their molecular structure and composition. TOF-SIMS tandem MS has been used to unambiguously verify analytes and to generate 2D and 3D maps of a brass corrosion inhibitor [1], to image organelles in single cells [2], and to discern the degradation pathway of OLEDs [3].

In this presentation, we will explore the application of surface-induced dissociation (SID) coupled with CID to assist in the confirmation of molecular assignments. In contrast to CID, which promotes cleavage at every molecular bond, SID is more subtle in that the bond cleavages result predominantly in the observation of functional-group chemistry, as shown in Figure 1. The fragmentation energetics of SID and CID are distinct, even at the same kinetic energy, which has pronounced effects on mass calibration; we will address this difference using isotopic abundances to confirm compositional assignments.
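
As a minimal illustration of using isotopic abundances to check a compositional assignment (the exact procedure of the talk is not given here), the expected (M+1)/M intensity ratio of a candidate fragment can be estimated from approximate natural isotope abundances and compared with the measured ratio:

```python
# Rough check of a compositional assignment from the (M+1)/M isotope ratio.
# The abundance values are nominal literature figures, used only for illustration.
ISO_PLUS_ONE = {"C": 0.0107, "H": 0.000115, "N": 0.00364}   # 13C, 2H, 15N fractions

def expected_m_plus_1(formula):
    """formula: dict of element counts, e.g. {"C": 7, "H": 7, "N": 1} for a C7H7N+ ion."""
    return sum(n * ISO_PLUS_ONE.get(el, 0.0) for el, n in formula.items())

measured_ratio = 0.081                      # hypothetical measured (M+1)/M ratio
candidate = {"C": 7, "H": 7, "N": 1}
print("expected (M+1)/M:", round(expected_m_plus_1(candidate), 3))
print("measured (M+1)/M:", measured_ratio)
# Agreement within counting statistics supports the assignment; a large mismatch
# suggests an incorrect formula or an unresolved interference.
```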

  1. M. Finšgar, Corrosion Science 182 (2021) 109269.
  2. P. Agüi-Gonzalez et al., J. Anal. At. Spectrom. 34 (2019) 1355–1368.
  3. K. Sawada et al., SID Symp. Dig. Tech. Pap. 54 (2023) 1291–1293.
12:00 PM AS-ThM-17 Dealing with Reproducibility and Replication Challenges in Surface Analysis: Sample Provenance Information, Parameter Reporting, and Cultural Issues
Donald Baer (Pacific Northwest National Laboratory)

It has been increasingly recognized that, in many areas of surface and materials science, faulty data and analyses, along with sparse reporting of experimental and analysis details, complicate the ability of other researchers to assess and replicate published results. In the past several years, the surface analysis community has initiated multiple efforts to address these issues, mostly providing information and guidance to help researchers appropriately collect and analyze data. The status of some of these efforts will be reported during this talk, with a focus on topics related to collecting sample provenance information, methods to facilitate parameter collection and reporting, and the importance of addressing cultural issues.

The increased use of surface methods by inexperienced analysts and by researchers outside the surface analysis community contributes to the problem. Therefore, a variety of guides, tutorials and helpful websites have been and are being prepared, including two collections of papers on reproducibility challenges and solutions published in the Journal of Vacuum Science and Technology A; a series of Notes and Insight papers in Surface and Interface Analysis, with the objective of providing short, focused discussions on important topics; a discussion of common XPS errors and parameter reporting in Applied Surface Science Reports; and the information on the websites XPSlibrary.com and XPSOasis.org. The ISO Committee TC201 on Surface Chemical Analysis is developing a standard on reporting information regarding the preparation of samples for surface analysis. This information is important for a sample provenance record that provides a traceable history of samples undergoing analysis. Recent reports on the lack of information about instruments used for data collection and analysis will be partially addressed by new publications in Surface Science Spectra providing referenceable details about widely used commercial instruments, along with descriptions of operator-selected modes of instrument operation.

Finally, these efforts will be effective only to the extent that the members of the community accept, use, and promote the use of these tools and concepts. Specifically, it is important that the culture of the community values and supports efforts to enhance research quality. There are important roles for reviewers, colleagues, educators, editors, professional societies, and granting agencies in identifying and addressing sloppy science.
