
Investigation of a method for the correction of self-absorption by Planck function in laser induced breakdown spectroscopy

Bibliographic Details
Published in: Journal of Analytical Atomic Spectrometry, 2023-04, Vol. 38 (4), p. 911-916
Main Authors: Völker, Tobias; Gornushkin, Igor B.
Format: Article
Language: English
Summary: The electron density and temperature of a laser-induced plasma can be determined from the width and intensity of its spectral lines, provided that the corresponding optical transitions are optically thin. However, lines in a laser-induced plasma are often self-absorbed. One method of correcting for this effect is based on the Planck function and an iterative numerical calculation of the plasma temperature. In this study, the method is explored further and its inherent errors and limitations are evaluated. For this purpose, synthetic spectra are used that fully correspond to the assumed conditions of a homogeneous, isothermal plasma in local thermodynamic equilibrium. Based on the error analysis, the advantages and disadvantages of the method are discussed in comparison with other self-absorption correction methods. Self-absorption correction of LIBS spectra using the Planck function or, equivalently, the plasma temperature.
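The core idea behind a Planck-function-based self-absorption correction can be sketched as follows. For a homogeneous plasma in local thermodynamic equilibrium, a measured line intensity is bounded by the blackbody radiance, I = B(λ, T)·(1 − e^(−τ)), and inverting this relation recovers the optically thin intensity B(λ, T)·τ. This is only an illustrative sketch under those assumptions, not the authors' code; the function names and the numerical example are hypothetical, and the full method described in the abstract additionally iterates the temperature, which is omitted here.

```python
import numpy as np

# CODATA values of the physical constants
H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck(wl_m, T):
    """Spectral radiance B(lambda, T) of a blackbody, W sr^-1 m^-3.

    wl_m : wavelength in metres, T : temperature in kelvin.
    np.expm1 keeps the denominator accurate for small exponents.
    """
    return (2.0 * H * C**2 / wl_m**5) / np.expm1(H * C / (wl_m * KB * T))

def correct_self_absorption(I_meas, wl_m, T):
    """Illustrative single-line correction step (hypothetical helper).

    Inverts I_meas = B * (1 - exp(-tau)) for the optical depth tau,
    then returns the optically thin intensity B * tau. Assumes the
    plasma temperature T is already known (in the full method it is
    obtained iteratively).
    """
    B = planck(wl_m, T)
    # A physical line cannot exceed the blackbody limit; clip to stay
    # inside the domain of the logarithm.
    ratio = np.clip(I_meas / B, None, 1.0 - 1e-12)
    tau = -np.log1p(-ratio)          # tau = -ln(1 - I/B)
    return B * tau
```

For a weak (optically thin) line, τ is small and the correction leaves the intensity essentially unchanged; for a strongly self-absorbed line, the recovered thin intensity B·τ can exceed the measured one substantially, which is exactly the effect the method compensates for.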
ISSN: 0267-9477, 1364-5544
DOI: 10.1039/d2ja00352j