Hi everyone,

I am having a lot of trouble fitting my samples' spectra with my standards. Initially I used Athena to do the background subtraction, but the fitting results look odd: the fit does not actually match the measured spectra. Since I am analyzing sulphur, and many papers have reported that it is prone to self-absorption, I thought a better background subtraction such as MBACK might give better results. So I used MBACK for the normalization, but unlike in Athena, the normalized intensity of the spectra does not equal 1. As a result, the intensity of my standard models is much higher than that of the actual samples, so when I then fit my measured spectra in Athena, I do not get good results.

I have also tried fitting the third derivative of my absorption spectra, but those results are even worse: the intensity of the fitted spectrum is much higher than that of the sample spectrum it is supposed to fit. In fact, when I stack the spectra of the standards and the sample together, the third-derivative peaks of the sample look almost completely flat because the intensity of the standards is so high.

What did I do wrong, and what can I do to fix it? I have attached the spectra of my samples and standards.

Thank you very much for the assistance.

Niken
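P.S. One workaround I am considering is rescaling every spectrum so that its mean post-edge intensity equals 1 before doing the linear combination fit, so the MBACK-normalized standards and samples are on a common scale. Here is a minimal sketch of what I mean; the energy values and post-edge window are hypothetical placeholders for the S K-edge region, not my actual data:

```python
import numpy as np

def rescale_to_unit_edge(energy, mu, post_min, post_max):
    """Rescale a normalized spectrum so that its mean intensity
    over the chosen post-edge window equals 1, putting standards
    and samples on a common scale before fitting."""
    mask = (energy >= post_min) & (energy <= post_max)
    scale = np.mean(mu[mask])  # current post-edge level
    return mu / scale

# Toy example: a step-like "spectrum" whose post-edge level
# sits near 1.6 instead of 1 (as I see after MBACK).
energy = np.linspace(2460.0, 2520.0, 121)   # eV, S K-edge region
mu = np.where(energy < 2472.0, 0.1, 1.6)    # hypothetical data
mu_rescaled = rescale_to_unit_edge(energy, mu, 2490.0, 2520.0)
```

Would rescaling like this be a legitimate way to make the standards comparable to the samples, or does it defeat the purpose of using MBACK in the first place?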