7 Jul 2003, 10:43 p.m.
Matt Newville writes:
Using a solid-state detector with low-concentration samples, it's common to do a couple of scans counting for a few seconds per point, then more scans counting for a longer time per point (say, first 3 sec/pt, then 10 sec/pt). The data are typically better with the longer counting time (though not always better by the square root of the time), but you want to use all the noisy data you have. In such a case, a weighted average based on after-the-fact data quality would be useful.
This was the kind of example I was looking for. I agree that in this case it makes sense to use the high-R noise estimate for averaging, and thus it's useful to have this option implemented in software. --Scott
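[Editor's note: for concreteness, here is a minimal sketch of the inverse-variance weighting described above, in Python with NumPy. The function name weighted_average and its arguments are illustrative rather than taken from any particular package; the sketch assumes the scans have already been interpolated onto a common energy or k grid, and that a per-scan noise estimate (for instance, the high-R estimate Scott mentions) is in hand.]

    import numpy as np

    def weighted_average(scans, sigmas):
        """Inverse-variance weighted average of repeated scans.

        scans  : sequence of 1-D arrays on a common energy/k grid
        sigmas : one after-the-fact noise estimate per scan, e.g.
                 from the high-R portion of the Fourier transform
        """
        scans = np.asarray(scans, dtype=float)
        weights = 1.0 / np.asarray(sigmas, dtype=float) ** 2
        return (weights[:, None] * scans).sum(axis=0) / weights.sum()

    # Example: three noisy 3 sec/pt scans and two quieter 10 sec/pt
    # scans. If noise fell exactly as the square root of the counting
    # time, the sigma ratio would be sqrt(10/3) ~ 1.8; in practice the
    # measured sigmas need not follow that scaling, which is why
    # weighting by the observed noise is useful.
    rng = np.random.default_rng(0)
    k = np.linspace(0.0, 10.0, 201)
    signal = np.sin(k)
    short_scans = [signal + rng.normal(0.0, 0.10, k.size) for _ in range(3)]
    long_scans = [signal + rng.normal(0.0, 0.05, k.size) for _ in range(2)]
    avg = weighted_average(short_scans + long_scans,
                           [0.10] * 3 + [0.05] * 2)

Each scan's weight is 1/sigma^2, so a scan with half the noise counts four times as much in the average; the weights come from the measured noise of each scan, not from the nominal counting times.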