I agree about the first point, but what's done is done, and the user who started this discussion has data which he needs to do something with.

What's Larch? Why can't ifeffit do what I suggest? Is it because it uses an algorithm which requires uniform tabulation?

One issue with my proposal: post-edge background subtraction (and even pre-edge, if the Bragg glitch comes at the wrong place) could be corrupted by gaps in the data. Imagine if a spline knot falls in a wide gap.

An example of an effect similar to Bragg peaks but not fixable by screwing with data acquisition: 2-electron peaks in elements like Ce. For those, you just have to subtract the peaks, but fortunately there is literature on how to do that.
mam

On 12/12/2012 3:06 PM, Bruce Ravel wrote:
On Wednesday, December 12, 2012 03:00:41 PM Matthew Marcus wrote:
Hmm. Another possible way to do it is to delete the bad data points and then do a "slow" FT, which would be a fit of the data, at the given points with no interpolation, to a sum of sines and cosines. This would have the nice feature of using the data as it is and ignoring the bad stuff. Filtering would involve multiplying the sines and cosines by some window function (in R space) and evaluating them *at the given k-points*, not on a regular grid. This of course means that evaluating FEFF paths and the like is likely to be slow, because you don't get to use recursion relations to evaluate sin(2*R*k(i)+delta) as you would if the k(i) were uniformly tabulated. Now that computers are a bazillion times faster than they were when EXAFS analysis traditions were established, maybe that's the way to go. What do you think?
I think data should be measured correctly in the first place if at all possible! ;)
That said, your suggestion seems completely valid to me. And, just to be clear, it's completely impossible with Ifeffit. Larch, on the other hand....
B
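
[Editor's note: for concreteness, below is a minimal sketch of the "slow FT" Matthew describes: a least-squares fit of chi(k), sampled at whatever k-points survive after deleting the bad ones, to sines and cosines on an R grid, followed by filtering with an R-space window and re-evaluation at the original k-points. The function names, the R-grid handling, and the Hanning window are illustrative assumptions, not Ifeffit or Larch code; k-weighting is omitted for brevity.]

    # Sketch of a "slow" (non-uniform) Fourier transform and filter for
    # chi(k) sampled at arbitrary, possibly gappy, k-points.
    import numpy as np

    def design_matrix(k, r_grid):
        """Columns are sin(2*k*R) and cos(2*k*R) for each R in r_grid."""
        phase = 2.0 * np.outer(k, r_grid)            # shape (npts, nR)
        return np.hstack([np.sin(phase), np.cos(phase)])

    def slow_ft(k, chi, r_grid):
        """Least-squares sine/cosine amplitudes using only the given k-points."""
        A = design_matrix(k, r_grid)
        coef, *_ = np.linalg.lstsq(A, chi, rcond=None)
        nR = len(r_grid)
        return coef[:nR], coef[nR:]                  # sin and cos amplitudes

    def slow_filter(k, chi, r_grid, rmin, rmax):
        """Fourier-filter chi by windowing in R and re-evaluating the sum
        at the original k-points -- no interpolation onto a uniform grid."""
        sin_amp, cos_amp = slow_ft(k, chi, r_grid)
        win = np.zeros(len(r_grid))
        inside = (r_grid >= rmin) & (r_grid <= rmax)
        win[inside] = np.hanning(inside.sum())       # assumed window shape
        A = design_matrix(k, r_grid)
        coef = np.concatenate([sin_amp * win, cos_amp * win])
        return A @ coef                              # filtered chi at the given k

With bad points simply deleted from k and chi, the fit uses only the data that remain, which is the point of the suggestion; if the gaps are wide relative to the R-grid spacing, the least-squares problem can become ill-conditioned, so in practice some regularization or a coarser R grid might be needed.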