[Ifeffit] How to do with diffraction peak

Matt Newville newville at cars.uchicago.edu
Wed Dec 12 22:03:23 CST 2012


Hi Matthew, Bruce, All,

Sorry for not being able to join this discussion earlier.  I agree
that having glitch-free data is preferred. But I also think it's OK to
simply remove diffraction peaks from XAFS data -- you're just
asserting that you know those values are not good measurements of
mu(E).  I don't see it as all that different from removing obvious
outliers from the fluorescence channels of a multi-element detector --
though that approach has the appeal of keeping good measurements at a
particular energy.

On Wed, Dec 12, 2012 at 5:06 PM, Bruce Ravel <bravel at bnl.gov> wrote:
> On Wednesday, December 12, 2012 03:00:41 PM Matthew Marcus wrote:
>> Hmm.  Another possible way to do it is to delete the bad data points and
>> then do a "slow" FT, which would be a fit of  the data, at the points given
>> with no interpolation, to a sum of sines and cosines.  This would have the
>> nice feature of using the data as it is and ignoring the bad
>> stuff.  Filtering would involve multiplying  the sines and cosines by some
>> window function (in R space) and evaluating them *at the given k-points*,
>> not on a regular grid.  This of course means that evaluating FEFF paths and
>> the like is likely to be slow because you don't get to use recursion
>> relations to evaluate sin(2*R*k(i)+delta) as you would if k(i) are
>> uniformly tabulated.  Now that computers are a bazillion times faster than
>> they were when EXAFS analysis traditions were established, maybe that's the
>> way to go.  What do you think?

I'm not sure that "slow FT" versus "discrete FT" is that important
here, though perhaps I'm missing your meaning.  My view is that the
EXAFS signal is band-limited (finite k range due to F(k) and 1/k, and
finite R range due to F(k), lambda(k), and 1/R^2), so that sampling
on reasonably fine grids is going to preserve all the information that
is really there.
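
As an illustration of the "slow FT" idea above: a least-squares fit of
chi(k), sampled on an irregular k-grid with the glitched points simply
deleted, to sines and cosines at a set of R values.  This is a minimal
numpy sketch; the function name and the synthetic single-shell signal
are made up for illustration.

```python
import numpy as np

def slow_ft(k, chi, rgrid):
    """Fit chi(k) to sum_j [a_j*cos(2*R_j*k) + b_j*sin(2*R_j*k)]
    on an arbitrary (possibly gappy) k-grid; return the complex
    amplitude a_j + i*b_j for each R value."""
    # design matrix: one cosine and one sine column per R value
    cols = []
    for r in rgrid:
        cols.append(np.cos(2 * r * k))
        cols.append(np.sin(2 * r * k))
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, chi, rcond=None)
    return coef[0::2] + 1j * coef[1::2]

# synthetic single-shell signal on an irregular k-grid with a gap
rng = np.random.default_rng(0)
k = np.sort(rng.uniform(2.0, 14.0, 200))
k = k[(k < 7.9) | (k > 8.3)]          # "glitch" region deleted outright
chi = 0.8 * np.sin(2 * 2.5 * k)       # shell at R = 2.5

rgrid = np.arange(0.5, 6.0, 0.25)
amps = slow_ft(k, chi, rgrid)
r_peak = rgrid[np.argmax(np.abs(amps))]
print(r_peak)                          # peak near R = 2.5
```

Since the fit is evaluated only at the measured k-points, the deleted
glitch region contributes nothing at all.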

I do think that having richer window functions and spectral weightings
would be very useful.  You could view chi(k) data with glitches as
wanting a window function that has a very small value (possibly zero:
remove this point) exactly at the glitch, but a large value (~1)
at nearby k-values.
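
A sketch of such a window: a standard Hanning-style k-window
multiplied by a narrow Gaussian notch, so it is zero at the glitch and
recovers to ~1 within a few notch-widths.  The function name and
parameters here are made up for illustration, not any package's API.

```python
import numpy as np

def notched_hanning(k, kmin, kmax, dk, kglitch, width):
    # standard Hanning taper of width dk at each end of [kmin, kmax]
    win = np.ones_like(k)
    lo = (k >= kmin - dk/2) & (k <= kmin + dk/2)
    hi = (k >= kmax - dk/2) & (k <= kmax + dk/2)
    win[lo] = np.sin(np.pi/2 * (k[lo] - (kmin - dk/2)) / dk)**2
    win[hi] = np.cos(np.pi/2 * (k[hi] - (kmax - dk/2)) / dk)**2
    win[k < kmin - dk/2] = 0.0
    win[k > kmax + dk/2] = 0.0
    # Gaussian notch: zero exactly at kglitch, ~1 a few widths away
    win *= 1.0 - np.exp(-0.5 * ((k - kglitch) / width)**2)
    return win

k = np.arange(0.0, 16.0, 0.05)
win = notched_hanning(k, kmin=2.0, kmax=14.0, dk=1.0,
                      kglitch=8.0, width=0.1)
print(win[np.argmin(np.abs(k - 8.0))])   # ~0 at the glitch
print(win[np.argmin(np.abs(k - 6.0))])   # ~1 away from it
```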

Another (and possibly best) approach is to assign an uncertainty to
each k value of chi.  At the glitch, the uncertainty needs to go way
up, so that not matching that point does not harm the overall fit.
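A toy numpy illustration of that weighting: fit a single amplitude to
chi(k) containing a huge spike, with the uncertainty epsilon inflated
at the spiked point so it carries essentially no weight in the fit.
The numbers and the one-parameter model are invented for illustration.

```python
import numpy as np

k = np.linspace(2, 14, 121)
chi = 0.8 * np.sin(2 * 2.5 * k)      # single shell at R = 2.5
iglitch = 60
chi[iglitch] += 50.0                 # huge diffraction spike

eps = np.full_like(k, 0.01)          # nominal uncertainty
eps[iglitch] = 1e6                   # glitch point: uncertainty way up

# linear model chi = A * sin(2*2.5*k); weighted least squares
# minimizes sum(((chi - A*s)/eps)^2), so A = sum(w*s*chi)/sum(w*s*s)
s = np.sin(2 * 2.5 * k)
w = 1.0 / eps**2
A = np.sum(w * s * chi) / np.sum(w * s * s)
print(A)                             # ~0.8 despite the spike
```

With uniform weights the spike would badly skew A; with the inflated
epsilon, not matching that point costs the fit almost nothing.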

Larch has a separate Transform class used for each dataset in a fit -
this includes the standard FT parameters and fit ranges.  It is
intended to be able to do this sort of advanced windowing (and, in
principle, advanced weighting too).  Currently this has "the standard
XAFS window functions", but one can create one's own window functions
(for k- and/or R-space) with suppressed points/regions, etc., and use
them as well.  (I'd be ready for a release, but am commissioning our
new beamline, and am hoping to post a Windows installer by the end of
the month.)  I have to admit I haven't tried this, but it was
definitely part of the intention, and should work.  My hope is to
extend this to include some wavelet-like weightings as well.
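
For concreteness, the kind of thing a user-defined window enables,
written in plain numpy (this is NOT the actual Larch Transform API):
build an arbitrary window array -- here a box window with a glitch
region zeroed out -- and use it in an FFT-based estimate of chi(R).

```python
import numpy as np

kstep = 0.05
k = np.arange(0, 20, kstep)
chi = 0.8 * np.sin(2 * 2.5 * k) * np.exp(-0.02 * k**2)

# any window array works, including one with suppressed regions
win = np.where((k > 2) & (k < 14), 1.0, 0.0)
win[(k > 7.9) & (k < 8.1)] = 0.0     # zero out a glitch region

nfft = 2048
cft = np.fft.fft(chi * win * k**2, n=nfft)   # k^2-weighted FT
r = np.pi * np.arange(nfft // 2) / (kstep * nfft)
chir = kstep * cft[:nfft // 2] / np.sqrt(np.pi)
print(r[np.argmax(np.abs(chir))])    # peak near R = 2.5
```

The window is just an array multiplied in before the FFT, so a
suppressed-region or notched window needs no special support beyond
letting the user supply the array.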

--Matt



More information about the Ifeffit mailing list