[Ifeffit] Basic question in normalization and background removal

Bajamundi Cyril Cyril.Bajamundi at vtt.fi
Fri Feb 8 07:22:14 CST 2013


Hi Bruce, 

Thank you for a very clear-cut explanation of the importance of having longer pre- and post-edge data. I shall keep this in mind. 

For a given sample, I actually have 
 (1) a single scan that covers the energy range -200 to 600 eV relative to E0, and 
 (2) triplicate scans that cover the range of -30 to 80 eV relative to E0.

So I guess your empirical suggestion would then be applicable. 
 " An empirical solution would be to measure at least one proper spectrum (i.e. with a decent pre- and post-edge range).   Then play around with the normalization parameters until your short-range data on that sample looks like your long-range data on that same sample."

To give you a little background, I intend to use the merged spectrum of the triplicates in the LCF to reduce the noise in my data, but as I have said, I have no defensible, non-arbitrary way of setting the normalization range. My first naïve approach to choosing the normalization range is to pick a range such that the post-edge line is parallel to the pre-edge line; to my novice eyes this seems to hold between these two lines, at least in the XANES range. This observation came from playing with some spectra in databases I found online.
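In case it helps, here is roughly what I have been doing, as a plain-numpy sketch (the arrays and the window bounds are placeholders for my data, nothing from Athena):

    import numpy as np

    def line_slope(energy, mu, emin, emax):
        """Fit mu(E) with a straight line over [emin, emax] (eV relative
        to E0) and return the slope."""
        sel = (energy >= emin) & (energy <= emax)
        slope, _ = np.polyfit(energy[sel], mu[sel], 1)
        return slope

    # placeholders standing in for one of my measured short-range scans
    energy = np.linspace(-30.0, 80.0, 221)   # eV relative to E0
    mu = np.zeros_like(energy)               # stand-in for measured mu(E)

    pre_slope = line_slope(energy, mu, -30.0, -10.0)
    post_slope = line_slope(energy, mu, 40.0, 80.0)

    # my naive criterion: accept the range if the two slopes agree
    print("pre %.3g, post %.3g, difference %.3g"
          % (pre_slope, post_slope, post_slope - pre_slope))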
May I know the pitfalls of this approach?

Thank you very much. 

Warm regards, Cyril 


-----Original Message-----
From: ifeffit-bounces at millenia.cars.aps.anl.gov [mailto:ifeffit-bounces at millenia.cars.aps.anl.gov] On Behalf Of Ravel, Bruce
Sent: Friday, February 08, 2013 2:17 PM
To: XAFS Analysis using Ifeffit
Subject: Re: [Ifeffit] Basic question in normalization and background removal


Cyril,

My favorite answer is "don't measure data that way".  At the very least, add a few widely spaced points in the pre- and post-edge regions so that normalization can be done.  If you added 10 points spaced 20 volts apart to the data on the right, then normalization would be much less ambiguous.

The concept of the normalization algorithm is that the fine structure eventually damps away and all spectra approach the bare-atom spectrum.  In short, all data eventually approach an exponentially decaying edge step.
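In pseudo-code, the usual recipe is roughly the sketch below.  To be clear, this is not Athena's actual implementation, just a plain-numpy illustration, and the default window values are made up:

    import numpy as np

    def normalize(energy, mu, e0, pre=(-200.0, -50.0),
                  post=(100.0, 600.0), post_deg=2):
        """Edge-step normalization sketch: subtract a line fitted to
        the pre-edge, divide by the edge step taken from a polynomial
        fitted to the post-edge.  Windows are in eV relative to e0."""
        # pre-edge line, fit well below the edge
        sel = (energy >= e0 + pre[0]) & (energy <= e0 + pre[1])
        pre_line = np.poly1d(np.polyfit(energy[sel], mu[sel], 1))

        # post-edge polynomial, fit where the fine structure has
        # damped away toward the bare-atom curve
        sel = (energy >= e0 + post[0]) & (energy <= e0 + post[1])
        post_poly = np.poly1d(np.polyfit(energy[sel], mu[sel], post_deg))

        step = post_poly(e0) - pre_line(e0)      # the edge step
        return (mu - pre_line(energy)) / step

With data that stop 80 volts above the edge, there is nowhere sensible to put the post-edge window, which is exactly the problem.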

If you only measure a few volts above the edge, you have no clue how your specific data relate to that decaying edge step, thus you don't really know how to do the normalization.  The problem is compounded by the fact that the shape of the actual measured data is affected by how it is measured.

If you somehow prepare the same material in a way that is suitable for high-quality transmission and fluorescence, the raw data will look rather different -- the transmission data will tend to slope downwards from the beginning of the data to their end, while the fluorescence data will tend to slope upwards.  And a fluorescence sample with relatively high concentration will look different from the same material with relatively low concentration, due to the relative sizes of the fluorescence line and the Compton/elastic contribution.  And data measured with an integrating detector will look somewhat different from data measured with an energy-discriminating detector.

If you measure enough data to do a good job of normalization, all of those differences can be handled in a defensible way.  If you do not, well ....

While the correct answer is "don't do that", it would seem that the reality is "you've already done that".  So what now?

Someone may suggest using an algorithm that compares the measured data to tabulated values of the bare-atom absorption.  Ifeffit does this, but that feature is not currently enabled in Athena.  (It's on the to-do list, FWIW.)  Or do a Google Scholar search for the paper about "MBACK".  That will sort of solve the problem, in the sense that it offers a somewhat more stable way of dealing with short-range data.  But it's not a perfect solution, for all the same reasons I described above.
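Very schematically, the bare-atom matching idea amounts to something like the toy least-squares fit below.  This is not the published MBACK algorithm, just the flavor of it, and mu_tab stands for tabulated bare-atom cross-sections from whatever table you trust:

    import numpy as np

    def match_to_bare_atom(energy, mu, mu_tab):
        """Toy bare-atom matching: find a scale a and a linear
        background b + c*E so that a*mu + b + c*E best matches the
        tabulated bare-atom absorption mu_tab, in a least-squares
        sense."""
        design = np.column_stack([mu, np.ones_like(energy), energy])
        (a, b, c), *_ = np.linalg.lstsq(design, mu_tab, rcond=None)
        return a * mu + b + c * energy   # data on the bare-atom scale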

An empirical solution would be to measure at least one proper spectrum (i.e. with a decent pre- and post-edge range).  Then play around with the normalization parameters until your short-range data on that sample looks like your long-range data on that same sample.  Then use that set of normalization parameters for your entire ensemble of short-range data and keep your fingers crossed that it works across the ensemble.
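In terms of the normalize() sketch earlier in this message, that amounts to something like the following.  The window values and the toy data are of course placeholders; the point is only that the windows get frozen once and reused:

    import numpy as np
    # assumes normalize() as defined in the sketch above

    # toy stand-ins for three short-range scans at a hypothetical Fe edge
    e0 = 7112.0
    energy = np.linspace(e0 - 30.0, e0 + 80.0, 221)
    mu = 0.5 * (1.0 + np.tanh((energy - e0) / 2.0))   # crude edge shape
    short_scans = [mu, mu, mu]

    # windows tuned once so that the long-range scan, truncated to this
    # range and normalized this way, matches its properly normalized self
    frozen = dict(pre=(-30.0, -15.0), post=(40.0, 80.0), post_deg=1)

    # then the same frozen windows are applied to every short-range scan
    normed = [normalize(energy, m, e0, **frozen) for m in short_scans]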

I understand the motivation to measure very short-range data.  You are doing a time-resolved experiment, or you have a bazillion samples to measure -- one of the standard reasons.  The thing you must ask yourself is whether you will be able to interpret, reliably and defensibly, the data and the quantity of data that you measure.  "Yes" means *you* (and I really mean you, not Athena ... you) have a reliable way of overcoming the limitation that you are imposing upon the set of measurements.  "No" means you need to perform the measurement differently next time.

Good luck,
B

________________________________________
From: ifeffit-bounces at millenia.cars.aps.anl.gov [ifeffit-bounces at millenia.cars.aps.anl.gov] on behalf of Bajamundi Cyril [Cyril.Bajamundi at vtt.fi]
Sent: Friday, February 08, 2013 2:28 AM
To: ifeffit at millenia.cars.aps.anl.gov
Subject: [Ifeffit] Basic question in normalization and background removal

Hello,

I have here a very basic question about XANES data processing. I am but a novice in XANES analysis.

In Bruce's lecture "Athena: Data processing I" <mms://diamstream02.diamond.ac.uk/xafs-2011-2>, posted on the Diamond Light Source website, he used the Fe_lepidocrocite.000 example to show how background removal and normalization are done.  For the normalization range he used 150 to 742.610 eV, because his original photon-energy scan range is quite long (Emin: -200, Emax: 800).

However, say that your original scan range is short, e.g. within the range Emin: -30 to Emax: 80, such that you don't see the non-wiggly range and you only see a short portion of the post-edge range. How does one set the normalization range for such data? Since I'm very new to XANES analysis, I keep second-guessing my choice of normalization range.

Your patience in answering this simple question will definitely help me move on to the actual fingerprinting analysis that I need to do on my sample.
Many thanks.


Warm regards,
Cyril



_______________________________________________
Ifeffit mailing list
Ifeffit at millenia.cars.aps.anl.gov
http://millenia.cars.aps.anl.gov/mailman/listinfo/ifeffit



