Hi Dave and Matt,

Sorry for the late reply. I was attending a workshop at the ALS last week.

The conventional way to scale EXAFS is to extrapolate the spline to the "edge energy" and set this pre-edge-background-free value to 1.0:

    chi(k) = [mu(E) - mu0(E)] / mu0(E)
           ~ [mu(E) - mu_bkg(E)] / mu_bkg(E)
           ~ [mu(E) - mu_bkg(E)] / mu_bkg(E0)

where mu_bkg(E0) gives an edge jump of one. For transmission XAS data, using Victoreen or second-order polynomial background functions has been sufficient to give correct EXAFS for two decades. Since EXAFS is the fractional modulation of the absorption coefficient, improper EXAFS scaling can lead to different EXAFS fitting results.

In a few cases this can be a problem. I can think of two scenarios: 1) fluorescence XAS data for dilute samples, and 2) XAS with a large intensity in the principal peak, e.g. lanthanides. In scenario 1, using a linear function to fit a large, curved pre-edge background results in a different EXAFS scale. In scenario 2, the intense peak heavily affects the curvature of the spline and makes it difficult to define a unique edge jump.

We have XAS spectra for duplicate protein samples measured at two beamlines. Using stepwise normalization (pre-edge + post-edge), one gives ZnS4 and the other gives ZnS3O, while the x-ray crystal structure shows it is a ZnS4 site. If we use a conjunctional background (MBACK), both give ZnS4. I don't have an example for scenario 2, but I have used MBACK to normalize Gd L3/L2/L1 edges simultaneously and shown that the two Gd conformers have slightly different intensity in the L3 peak but not in L2/L1.

I agree with Matt that bkg_cl() plus a better pre-edge background could perform much the way MBACK does. The main application of MBACK would be XANES analysis. It's kind of irritating to see people normalize peak height to compare the edge shift in Mn or Fe.
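To make the scaling sensitivity concrete, here is a minimal numpy sketch (not Ifeffit or MBACK code; the fit windows relative to E0 are placeholder choices) of the stepwise edge-jump estimate that sets the overall EXAFS amplitude:

    import numpy as np

    def edge_jump(energy, mu, e0, pre=(-150.0, -30.0), post=(50.0, 300.0)):
        """Estimate mu_bkg(E0): fit a line to the pre-edge region and a
        quadratic to the post-edge region, extrapolate both to E0, and
        take the difference.  Window offsets are in eV relative to e0
        and are arbitrary assumptions."""
        pm = (energy >= e0 + pre[0]) & (energy <= e0 + pre[1])
        qm = (energy >= e0 + post[0]) & (energy <= e0 + post[1])
        pre_fit = np.polyfit(energy[pm], mu[pm], 1)    # linear pre-edge
        post_fit = np.polyfit(energy[qm], mu[qm], 2)   # quadratic post-edge
        return np.polyval(post_fit, e0) - np.polyval(pre_fit, e0)

Since chi(k) carries a factor of 1/mu_bkg(E0), any bias in this jump estimate rescales the apparent coordination numbers by the same factor, which is exactly how the same Zn sample can refine to ZnS4 with one normalization and ZnS3O with another.

Tsu-Chien

On Sep 29, 2005, at 5:00 PM, Matt Newville wrote: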
Hi Dave,
Thanks for bringing this up. I've talked with Tsu-Chien about the MBACK procedure described in the JSR article, and about whether he thought it would be worth adding to Ifeffit. My reading of our conversation was that implementing it soon wasn't a very high priority for either of us. (I don't recall whether Tsu-Chien is on this list, but I'd be happy to be corrected.)
I do not mean to say that I think it should not be done. I do think the Cromer-Liberman background command (bkg_cl) in Ifeffit is fairly close, but it does ignore any drifts in the instrument response. And Ifeffit always uses a linear pre-edge. I've come to understand (especially from people with very dilute samples and solid-state detectors) that this can be a noticeable problem.
I think including the MBACK approach is a good idea, and I would not want to discourage anyone from making that happen. But having an option for a Victoreen pre-edge, say, is probably even more important. I think that coupling a better pre-edge function with the Cromer-Liberman background might get 3/4 of the way to the MBACK approach.
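For concreteness, the Victoreen form is just a two-term fit in E^-3 and E^-4 (a constant offset is often added as well, e.g. for detector dark current); a minimal numpy sketch, not Ifeffit code, with the fit window left as an argument:

    import numpy as np

    def victoreen_preedge(energy, mu, emin, emax):
        """Fit mu(E) ~ a*E^-3 + b*E^-4 + c over the pre-edge window
        [emin, emax] and return the background extrapolated over the
        full energy grid."""
        m = (energy >= emin) & (energy <= emax)
        E = energy[m]
        A = np.column_stack([E**-3, E**-4, np.ones_like(E)])
        a, b, c = np.linalg.lstsq(A, mu[m], rcond=None)[0]
        return a * energy**-3 + b * energy**-4 + c

Unlike a straight line, this captures the downward curvature of the pre-edge background, which matters most for the dilute-sample fluorescence case mentioned above.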
I'd love to hear other opinions on this....
--Matt
On Thu, 29 Sep 2005, Barton, David (DG) wrote:
XANES Aficionados,
I know most users on this list are hard-core EXAFS analysts, but some of us use XANES analysis to decipher mixed phases or to get structural clues when the data quality isn't sufficient for EXAFS analysis. Anyway, I have a question for those interested in quantitative analysis of XANES, or for anyone willing to offer an opinion on methods of normalizing raw spectra to get reliable, repeatable comparisons of near-edge features.
Has anyone considered using the methods of Penner-Hahn to normalize their data to get reliable intensities of near-edge features? The reference to their method of normalization is below:
T.-C. Weng, G. S. Waldo and J. E. Penner-Hahn, "A method for normalization of X-ray absorption spectra," J. Synchrotron Rad. (2005), 12, 506-510. doi:10.1107/S0909049504034193
It is my understanding that Ifeffit uses a linear function to remove the pre-edge and a quadratic for the post-edge, and that Athena's "flatten" is a subtraction of these functions. This is an excellent method under most scenarios, since the background is usually fairly smooth and can be well approximated by a quadratic. Personally, I have had only a few rare cases where a quadratic on the post-edge was not sufficient to reliably normalize the data, and in those cases a third-order polynomial was. Does anyone else have an opinion on using alternative normalization routines?
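As a schematic of that procedure (a numpy sketch of the linear pre-edge / quadratic post-edge normalization plus the flattening step; not Athena's actual code):

    import numpy as np

    def normalize_and_flatten(energy, mu, e0, pre_fit, post_fit):
        """pre_fit: coefficients of a linear pre-edge fit (np.polyfit, deg 1);
        post_fit: coefficients of a quadratic post-edge fit (deg 2).
        Returns the normalized and the flattened spectrum."""
        pre_line = np.polyval(pre_fit, energy)
        post_curve = np.polyval(post_fit, energy)
        jump = np.polyval(post_fit, e0) - np.polyval(pre_fit, e0)
        norm = (mu - pre_line) / jump            # stepwise normalization
        flat = norm.copy()
        above = energy >= e0
        # Flatten: remove the residual post-edge slope/curvature above E0;
        # the correction vanishes at E0 and brings the post-edge to ~1.0.
        flat[above] -= (post_curve[above] - pre_line[above] - jump) / jump
        return norm, flat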
Dave

_________________________________
David Barton
The Dow Chemical Company
_______________________________________________
Ifeffit mailing list
Ifeffit@millenia.cars.aps.anl.gov
http://millenia.cars.aps.anl.gov/mailman/listinfo/ifeffit