Hi Nina,

Thank you for bringing your question to the list.

As others have said, the problem lies in not having enough data above the edge to normalize effectively, compounded by not taking control of the normalization process to make sure it's reasonably consistent across spectra. Standard A, for instance, has a post-edge line that cuts very low through the data, as does D. The calculated 1:1 mixture of A and D, though, comes out more evenly.
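
For anyone following along, it's worth seeing why that matters mechanically. Edge-step normalization amounts to something like the following sketch (illustrative Python, not Athena's actual code; Athena uses a polynomial post-edge curve plus a flattening step, but the idea is the same):

    import numpy as np

    def normalize(energy, mu, e0, pre=(-150, -30), post=(50, 300)):
        """Edge-step normalize mu(E) using linear pre- and post-edge fits."""
        # Fit a straight line to the pre-edge region...
        m = (energy >= e0 + pre[0]) & (energy <= e0 + pre[1])
        pre_line = np.polyfit(energy[m], mu[m], 1)
        # ...and another to the post-edge region.
        m = (energy >= e0 + post[0]) & (energy <= e0 + post[1])
        post_line = np.polyfit(energy[m], mu[m], 1)
        # The edge step is the separation of the two lines at e0,
        # and it scales the entire normalized spectrum.
        step = np.polyval(post_line, e0) - np.polyval(pre_line, e0)
        return (mu - np.polyval(pre_line, energy)) / step

The post-edge line enters only through the edge step, so a line that cuts low through the data gives a small step, which makes the normalized spectrum too tall. LCF then misreads that amplitude error as a difference in composition.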

This is a problem even with calculated mixtures, because you imported the calculated mixture as a mu(E) file rather than as a norm(E) file, so Athena normalizes it a second time. In theory, this should be fixable by setting the edge step of the calculated spectrum to 1 so that Athena doesn't try to normalize it again. Something's not quite right when I do that, because the calculated average of A and D sometimes slips below both of the other curves, and an average shouldn't do that. Nevertheless, it gets us close: an LCF fit now gives us 54.5% A and the rest D.
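
The reason importing as norm(E) should help: if the calculated mixture is built from spectra that are already normalized, there is nothing left for Athena to renormalize, and a 1:1 average is guaranteed to lie between its two ingredients at every point. A quick check (the arctan curves here are just invented stand-ins for normalized Cd L3 spectra):

    import numpy as np

    e = np.linspace(3500, 3700, 401)                  # energy grid (eV)
    norm_A = 0.5 + np.arctan((e - 3538) / 2) / np.pi  # stand-ins for the
    norm_D = 0.5 + np.arctan((e - 3541) / 3) / np.pi  # normalized standards

    mix = 0.5 * norm_A + 0.5 * norm_D                 # calculated 1:1 mixture

    # A pointwise average can never drop below (or rise above) both
    # ingredients; if an imported mixture does, its normalization is off.
    assert np.all(mix >= np.minimum(norm_A, norm_D) - 1e-12)
    assert np.all(mix <= np.maximum(norm_A, norm_D) + 1e-12)

So the fact that the calculated average slips below both standards is itself evidence that the mixture got normalized differently than its ingredients.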

Another approach is to uncheck the "force weights to sum to 1" box. This is often a good idea when normalizations are in doubt. That works quite well here, giving an A to D ratio of 49 to 53.
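
Since you asked about the fitting algorithm: LCF is ordinary linear least squares on the normalized spectra, and the checkbox just adds a sum-to-1 constraint, which can be handled by eliminating one weight. A minimal sketch (not Athena's actual code, which also allows fitted energy shifts, among other things):

    import numpy as np

    def lcf(target, standards, force_sum_to_1=True):
        """Fit target ~ sum_i w_i * standards[i] by linear least squares."""
        S = np.column_stack(standards)
        if force_sum_to_1:
            # Substitute w_last = 1 - (w_1 + ... + w_{n-1}) and solve
            # the smaller unconstrained problem that results.
            last = S[:, -1]
            w = np.linalg.lstsq(S[:, :-1] - last[:, None],
                                target - last, rcond=None)[0]
            return np.append(w, 1.0 - w.sum())
        # Unconstrained: the weights are free to absorb normalization errors.
        return np.linalg.lstsq(S, target, rcond=None)[0]

With the constraint off, how far the weights stray from summing to 1 is itself a useful diagnostic of how consistent your normalizations were.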

Summary:

--It's best, when possible, to collect data far enough above the edge to establish an unambiguous post-edge trendline.

--Post-edge lines should be examined, and changed if they are inconsistent between the spectra being used for LCF. It's better to eyeball the normalization, for instance, than to use radically different trendlines. I sometimes play around with trendlines by eye to see what range of normalizations they give, and incorporate that range into my final reported uncertainties.

--If normalizations are difficult for a particular set of spectra, it is often better to remove the requirement that the weights sum to 1. To the degree that the normalizations are off, there will be some error in the weights that are found, but at least the fitting routine can try to compensate for normalization differences by adjusting them. In other words, forcing the sum to be 1 when the normalizations disagree forces a bad fit; allowing the weights to total anything lets the algorithm transfer errors in normalization into errors in weighting (a toy demonstration follows below).
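
To make that last point concrete, here is a toy demonstration (same invented arctan stand-ins as before, with a 10% normalization error put in by hand):

    import numpy as np

    e = np.linspace(3500, 3700, 401)
    A = 0.5 + np.arctan((e - 3538) / 2) / np.pi   # illustrative standards
    D = 0.5 + np.arctan((e - 3541) / 3) / np.pi
    target = 0.5 * A + 0.5 * D                    # a true 1:1 mixture

    A_bad = A / 1.1   # pretend A's edge step was overestimated by 10%

    # Unconstrained least squares with the mis-normalized standard:
    w, *_ = np.linalg.lstsq(np.column_stack([A_bad, D]), target, rcond=None)
    print(w)   # ~[0.55, 0.50]: the error lands in A's weight, D's weight
               # stays honest, and the sum of ~1.05 flags the problem

Force those same weights to sum to 1 and the fit has to distort both of them to compensate.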

Hope that helps!

--Scott Calvin
Sarah Lawrence College

On Aug 15, 2011, at 5:40 AM, Nina Siebers wrote:

Dear All,

I acquired Cd L3-edge spectra of some binary and ternary mixtures in varying proportions, and of the individual components. The mixtures were prepared on a Cd-mass basis. I then tried to fit the reference spectra to the spectra of the mixtures using the linear combination fitting of Athena to get their abundances. However, the results were disappointing even though all spectra were carefully energy-calibrated and normalized, so I decided to create simple mathematical binary and ternary mixtures by summing the spectra of the individual references. After that I did an edge-step normalization in Excel and imported the normalized calculated mixtures into Athena. Then I tried the fitting again, to exclude mixing failures and to check the sensitivity of LCF with the idealized spectra. Even though the results of the LCF of the mathematical mixtures were better than those of the real mixtures, LCF was still not able to reliably deconvolute these spectra into the individual reference spectra.

Does anybody have an explanation for that? It would be nice if  
somebody could give me information about the mathematical fitting  
algorithm implemented in Athena.

Attached is a data file of three mixtures (two ternary and one binary) including the mathematical mixtures created in Excel (named "calculated" at the end). Mixing ratios are given in the names, e.g., 1to1to1 (meaning 1:1:1 of the components in the given order). For the 1:1:1 ternary mathematical mixture the deconvolution was very good, but the others need improvement.

I hope I made my problem clear this time.

Thanks a lot!
Wishes,
Nina


<Mixtures.prj>