[note: when putting together this post I discovered a possible bug regarding opening my project files on different PCs. I will submit that issue separately]

Hi all:

I am writing to draw on the community's experience with a problem I am having in measuring S02. I have data on a series of glass samples containing ZnO. I also measured a crystalline ZnO powder, and a Zn foil was measured in transmission mode between detectors 2 and 3 simultaneously with every run. My goal is to determine coordination numbers around Zn in the glass samples, which requires an accurate value of S02 for zinc.

My main question is whether there is any reason the Zn foil would provide a more accurate result than the ZnO powder, or vice versa. I see no reason why the two should differ, especially since this method depends entirely on the "chemical transferability" of S02 in the first place. Nevertheless, I obtain different results from each: S02 ~ 0.75 from the ZnO and ~0.88 from the foil.

Statistically, the Zn foil dataset is "better", since it is a merge of all of my scans while there are only two scans of ZnO. However, part of me wants to trust the ZnO data more, for admittedly vague reasons: its coordination environment is more similar to what I expect in my samples, it was measured between the same two detectors, and so on. If anyone has experience that would indicate why one should be better than the other, I would love to hear about it.

I will attach Artemis project files for each in case you'd like to take a look at my fits. It is entirely possible that the discrepancy is due to an error in my data processing or analysis; I am by no means an expert. I hope the project files, at a couple of megabytes each, will not clog up anyone's mailbox.

Thanks,
Jeremy
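
P.S. To put the discrepancy in perspective, here is a quick Python sketch of how the two S02 values would propagate into fitted coordination numbers. It relies only on the fact that the EXAFS amplitude scales as S02 * N, so the two are fully correlated in a fit; the starting value N = 4.0 is hypothetical, chosen just for illustration.

# Sketch: propagation of the S02 discrepancy into coordination numbers.
# In an EXAFS fit the amplitude goes as S02 * N, so a coordination number
# refined with one S02 value rescales by the ratio of the two S02 values.

s02_foil = 0.88    # S02 from the Zn foil fit
s02_oxide = 0.75   # S02 from the ZnO powder fit

n_with_foil = 4.0  # hypothetical N from a glass fit using the foil S02
n_with_oxide = n_with_foil * s02_foil / s02_oxide  # same amplitude, oxide S02

print(f"N assuming foil S02:  {n_with_foil:.2f}")
print(f"N assuming oxide S02: {n_with_oxide:.2f}")  # ~4.69, a ~17% spread

In other words, the choice between the two references shifts any coordination number I fit by about 17%, which is why I'd like to understand which S02 to trust.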