Hi Nina,

As others have said, normalization is critical for getting reasonable answers from LCF. One of the drawbacks of a linear analysis is that it always gives an answer, even when the assumptions going into the question are poor.
Even though the results of the LCF of the mathematical mixtures were better than for the real mixtures, LCF was still not able to reliably deconvolute these spectra into the individual reference spectra. Does anybody have an explanation for that?
I'm not sure what your tests entailed, but I would have expected that your 'A_D_1to1' or 'A_D_1to1_calculated' spectra would have been a 1-to-1 mixture of the A and D spectra. A 1-to-1 mixture would a) have several isosbestic points (at every E where A(E) = D(E), all linear combinations of A and D take that same value), and b) lie half-way between A(E) and D(E) where the difference is largest, say near E = 3568 eV. From the attached plot (simply plotting the data from your project), it's pretty obvious that neither of these is true. This might be related to poor normalization, a simple calculation mistake, or maybe I misunderstand your intent.
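If it helps, here is a minimal numpy sketch of that sanity check. The file names and two-column layout are just my guesses -- adapt them to however you exported the normalized spectra from Athena:

import numpy as np

# Load normalized mu(E) for references A and D on a common energy grid.
# "A_norm.dat" / "D_norm.dat" are hypothetical names for your exports.
energy, mu_A = np.loadtxt("A_norm.dat", unpack=True)
_, mu_D = np.loadtxt("D_norm.dat", unpack=True)

# A true 1-to-1 mathematical mixture is the average of the two spectra.
mix_1to1 = 0.5 * (mu_A + mu_D)

# (a) Wherever A(E) = D(E), every linear combination takes that same
# value, so the mixture must pass through each crossing point.
crossings = np.where(np.diff(np.sign(mu_A - mu_D)) != 0)[0]
print("isosbestic points near E =", energy[crossings])

# (b) Where |A - D| is largest, the 1:1 mixture lies half-way between.
i = np.argmax(np.abs(mu_A - mu_D))
print("at E = %.1f eV: A = %.3f, D = %.3f, mixture = %.3f"
      % (energy[i], mu_A[i], mu_D[i], mix_1to1[i]))

If your calculated 'A_D_1to1' spectrum fails either check, the problem is upstream of LCF.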
It would be nice if somebody could give me information about the mathematical fitting algorithm implemented in Athena.
Non-linear least-squares, using the Levenberg-Marquardt algorithm. See
http://en.wikipedia.org/wiki/Levenberg-Marquardt_algorithm
and do feel free to read the docs:
http://cars9.uchicago.edu/~ravel/software/doc/Athena/html/analysis/lcf.html
http://cars9.uchicago.edu/~ifeffit/refman/node63.html
Admittedly, Levenberg-Marquardt may not be the most obvious choice for a linear analysis, but it is usually quite robust and fast at finding optimal solutions for linear problems, and it is needed for other parts of XAFS analysis.
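In case it helps to see the idea in miniature, here is a sketch in Python/SciPy of an LCF posed as a least-squares problem and solved with Levenberg-Marquardt. To be clear, this is not Athena's actual code (Athena drives the fit through Ifeffit); the Gaussian "spectra" and every name below are invented for illustration.

import numpy as np
from scipy.optimize import least_squares

# Toy "reference spectra": two Gaussians standing in for normalized mu(E).
energy = np.linspace(3540, 3600, 500)
ref_A = np.exp(-0.5 * ((energy - 3560) / 3.0) ** 2)
ref_D = np.exp(-0.5 * ((energy - 3575) / 3.0) ** 2)
refs = np.column_stack([ref_A, ref_D])

# Synthetic "data": a 60/40 mixture plus a little noise.
rng = np.random.default_rng(0)
data = refs @ [0.6, 0.4] + 0.002 * rng.standard_normal(energy.size)

def residual(weights):
    # The LCF model is a weighted sum of the reference spectra.
    return data - refs @ weights

# method='lm' selects Levenberg-Marquardt.  Because the model is linear
# in the weights, LM converges in a few iterations from almost any start.
fit = least_squares(residual, x0=[0.5, 0.5], method='lm')
print("fitted fractions:", fit.x)   # ~ [0.6, 0.4]

Note that nothing in this bare-bones version forces the weights to be non-negative or to sum to 1, which is part of why a linear analysis always returns some answer, sensible or not.

--Matt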