Basic question in normalization and background removal
Hello,

I have here a very basic XANES data processing question; I am but a novice in XANES analysis.

In Bruce's lecture "Athena: Data processing I" (mms://diamstream02.diamond.ac.uk/xafs-2011-2), posted on the Diamond Light Source website, he used the Fe_lepidocrocite.000 example to show how background removal and normalization are done. For the normalization range he used 150 to 742.610 because his original photon energy scan range is quite long (Emin: -200, Emax: 800).

However, suppose your original scan range is short, e.g. only Emin: -30 to Emax: 80, so that you never see the non-wiggly region and only see a short portion of the post-edge range: how does one set the normalization range for such data? Since I'm very new to XANES analysis, I keep second-guessing my choice of normalization range.

Your patience in answering this simple question will definitely help me move on to the actual fingerprinting analysis that I need to do with my sample.

Many thanks.

Warm regards,
Cyril
Dear Cyril,

I think you should read about normalization of XANES (NEXAFS) spectra at soft X-rays. Normalization with a short post-edge region is a common problem, and there are a few tricks. I think you can find the answer in already published articles (as an example) or on the Ifeffit mailing list (there are a few previously discussed topics).

regards
kicaj
Cyril,

My favorite answer is "don't measure data that way". At the very least, add a few widely spaced points in the pre- and post-edge regions so that normalization can be done. If you added 10 points spaced 20 volts apart to the data on the right, then normalization would be much less ambiguous.

The concept of the normalization algorithm is that the fine structure eventually damps away and all spectra approach the bare-atom spectrum. In short, all data eventually approach an exponentially decaying edge step. If you only measure a few volts above the edge, you have no clue how your specific data relate to that decaying edge step, thus you don't really know how to do the normalization.

The problem is compounded by the fact that the shape of the actual measured data is affected by how it is measured. If you somehow prepare the same material in a way that is suitable for high quality transmission and fluorescence, the raw data will look rather different -- the transmission data will tend to slope downwards from the beginning of the data to their end, while the fluorescence data will tend to slope upwards. And a fluorescence sample with relatively high concentration will look different from the same material with relatively low concentration, due to the relative sizes of the fluorescence line and the Compton/elastic contribution. And data measured with an integrating detector will look somewhat different from data measured with an energy discriminating detector. If you measure enough data to do a good job of normalization, all of those differences can be handled in a defensible way. If you do not, well ....

While the correct answer is "don't do that", it would seem that the reality is "you've already done that". So what now?

Someone may suggest using an algorithm that compares the measured data to tabulated values of the bare atom absorption. Ifeffit does this, but that feature is not currently enabled in Athena. (It's on the to-do list, FWIW.) Or do a Google Scholar search for the paper about "MBACK". That will sort of solve the problem in the sense that it offers a somewhat more stable way of dealing with short-range data. But it's not a perfect solution, for all the same reasons I described above.

An empirical solution would be to measure at least one proper spectrum (i.e. with a decent pre- and post-edge range). Then play around with the normalization parameters until your short-range data on that sample look like your long-range data on that same sample. Then use that set of normalization parameters for your entire ensemble of short-range data and keep your fingers crossed that it works across the ensemble.

I understand the motivation to measure very short-range data. You are doing a time-resolved experiment, or you have a bazillion samples to measure -- one of the standard reasons. The thing you must ask yourself is whether you will be able to interpret reliably and defensibly the data and the quantity of data that you measure. "Yes" means *you* (and I really mean you, not Athena ... you) have a reliable way of overcoming the limitation that you are imposing upon the set of measurements. "No" means you need to perform the measurement differently next time.
Good luck,
B
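The normalization scheme described above can be illustrated with a short sketch: a line fitted to the pre-edge range, a low-order polynomial fitted to the post-edge (normalization) range, and the edge step taken as their difference at E0. This is a minimal, stand-alone NumPy version for illustration, not Athena's actual code; the function name and the example ranges are made up.

import numpy as np

def normalize_xanes(energy, mu, e0, pre=(-150.0, -30.0), norm=(150.0, 740.0), nnorm=2):
    """Return (normalized mu, edge step). All ranges are in eV relative to e0."""
    e = np.asarray(energy, dtype=float)
    m = np.asarray(mu, dtype=float)

    # Pre-edge: a straight line fitted between pre[0] and pre[1], below the edge.
    pmask = (e >= e0 + pre[0]) & (e <= e0 + pre[1])
    pre_coef = np.polyfit(e[pmask], m[pmask], 1)

    # Post-edge: a polynomial of order nnorm fitted between norm[0] and norm[1].
    nmask = (e >= e0 + norm[0]) & (e <= e0 + norm[1])
    post_coef = np.polyfit(e[nmask], m[nmask], nnorm)

    # Edge step: the difference of the two extrapolations evaluated at e0.
    edge_step = np.polyval(post_coef, e0) - np.polyval(pre_coef, e0)

    # Normalized mu(E): subtract the pre-edge line, divide by the edge step.
    return (m - np.polyval(pre_coef, e)) / edge_step, edge_step

With a scan that stops 80 eV above the edge, the post-edge range has to shrink to something like (30, 75) eV, and the fitted polynomial, and hence the edge step, then depends strongly on exactly where those limits fall; that dependence is the ambiguity described above.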
Hi Bruce,

Thank you for a very clear-cut explanation of the importance of having longer pre- and post-edge data. I shall keep this in mind.

For a given sample, I actually have (1) a single scan that covers the energy range -200 to 600 eV relative to E0, and (2) triplicate scans that cover the range -30 to 80 eV relative to E0. So I guess your empirical suggestion would then be applicable: "An empirical solution would be to measure at least one proper spectrum (i.e. with a decent pre- and post-edge range). Then play around with the normalization parameters until your short-range data on that sample looks like your long-range data on that same sample."

To give you a little background, I intend to use the merged spectrum of the triplicates in the LCF to reduce the noise in my data, but as I have said, I have no defensible and non-arbitrary way of setting the normalization range. My first, naive approach to approximating the normalization range is to choose a range such that the post-edge line is parallel with the pre-edge line; to my novice eyes this seems to be the case for these two lines, at least in the XANES range. This observation came from playing with some spectra in the databases I found online. May I know the pitfalls of this approach?

Thank you very much.

Warm regards,
Cyril
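One way to make the empirical tuning above less arbitrary is a crude grid search: normalize the long-range scan with comfortable ranges, then try a handful of candidate post-edge ranges for a short-range scan and keep the pair whose normalized XANES best overlays the long-range result on a common grid. The sketch below assumes the normalize_xanes helper from the earlier sketch and arrays (e_long, mu_long) and (e_short, mu_short) measured on the same sample; the candidate ranges and grid are illustrative only.

import numpy as np
# Uses normalize_xanes() from the earlier sketch; e0 is the edge energy in eV.

ref_norm, _ = normalize_xanes(e_long, mu_long, e0)       # long scan, generous default ranges
grid = np.linspace(e0 - 20.0, e0 + 70.0, 400)            # XANES region common to both scans
ref_on_grid = np.interp(grid, e_long, ref_norm)

best = None
for n1 in (25.0, 30.0, 35.0, 40.0):       # candidate start of the post-edge range (eV above E0)
    for n2 in (60.0, 70.0, 78.0):         # candidate end, limited by the 80 eV scan length
        # The short scan only reaches -30 eV, so its pre-edge range must be narrow too;
        # a first-order (linear) post-edge is safer than a quadratic over ~40 eV.
        trial, _ = normalize_xanes(e_short, mu_short, e0,
                                   pre=(-28.0, -10.0), norm=(n1, n2), nnorm=1)
        misfit = np.sqrt(np.mean((np.interp(grid, e_short, trial) - ref_on_grid) ** 2))
        if best is None or misfit < best[0]:
            best = (misfit, n1, n2)

print("best post-edge range: %.0f to %.0f eV above E0 (rms misfit %.4f)"
      % (best[1], best[2], best[0]))

The chosen parameters would then be applied unchanged to all three short-range scans before merging, rather than re-tuned per scan.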
On Friday, February 08, 2013 01:22:14 PM Bajamundi Cyril wrote:
My first, naive approach to approximating the normalization range is to choose a range such that the post-edge line is parallel with the pre-edge line; to my novice eyes this seems to be the case for these two lines, at least in the XANES range. This observation came from playing with some spectra in the databases I found online. May I know the pitfalls of this approach?
Pre- and post-edge lines are sometimes parallel, but not in general. As I said, the details of the sample and the method of measurement have a lot to do with what the data look like. I really cannot answer your question in general. What you suggest may work for you; it may not.

At the end of the day, the defensibility of an analysis is the bottom line. Can you convince your reader (or thesis advisor or potential boss in a job interview or whatever) that your interpretation of the analysis of your data is correct?

In the context of linear combination fitting, you have another tool at your disposal. You can prepare defined mixtures of your standards and measure those the same way you measure your actual data. If your normalization procedure and LCF analysis work on the defined mixtures and give an answer that is consistent, within its uncertainty, with how you made the mixtures, that gives you confidence that you can correctly analyze the unknowns.

Of course that means spending an hour or two at the beamline measuring those defined mixtures. That takes time away from measuring the real samples. But if you end up in a better position to interpret the real data, that seems like a win to me.

B

--
Bruce Ravel ------------------------------------ bravel@bnl.gov
National Institute of Standards and Technology
Synchrotron Methods Group at NSLS --- Beamlines U7A, X24A, X23A2
Building 535A
Upton NY, 11973
Homepage: http://xafs.org/BruceRavel
Software: https://github.com/bruceravel
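The check with defined mixtures can also be done numerically once the standards and the mixtures are normalized the same way and interpolated onto a common energy grid. Below is a minimal NumPy/SciPy sketch of a non-negative linear combination fit; note this is not Athena's LCF code, which can additionally constrain the weights to sum to one and fit a small energy shift. The array names are placeholders.

import numpy as np
from scipy.optimize import nnls

# standards: (npoints, nstandards) array of normalized spectra on a common grid;
# mixture:   (npoints,) normalized spectrum of a weighed-in mixture on the same grid.
weights, residual = nnls(standards, mixture)

# Report fractions rescaled to sum to one and compare them with the known composition;
# agreement within the scatter over repeated mixtures is the confidence check.
fractions = weights / weights.sum()
print("fitted fractions:", np.round(fractions, 3), " residual norm:", round(residual, 4))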
Thank you Bruce. I'll take your suggestions into consideration.

Warm regards,
Cyril
participants (4)
- "Dr. Dariusz A. Zając"
- Bajamundi Cyril
- Bruce Ravel
- Ravel, Bruce