On Fri, Jul 13, 2018 at 1:27 PM Macon Abernathy wrote:
Hey all,
Has anyone had success loading data from larger pixel arrays into Larch? I have data collected on a 100-element Ge detector, and Athena crashes when I try to import it (running on Mac OS 10.13.4, Demeter v9.25). I figure this is a good task for Larch, but I'm having a difficult time figuring out how to import all 100 channels for each data set simultaneously.
Thanks in advance for any tips,
What format is the data in? Larch can read very large files, including HDF5 and netCDF files. I don't know anyone using those formats for XAFS data, but I use them all the time for fluorescence maps, where the files can easily get to be tens of GB. But most XAFS data can be held in ASCII files, even for 100-element detectors.

If the data is in an ASCII file, something like

    fdata = read_ascii('MyDataFile.txt')

might just work, and will give you a 2D array in 'fdata.data' with a shape of (ncolumns, n_energy_points). read_ascii() will also try to parse the column labels, which may work for some data files but might not work well for all of them.

With the full 2D data table you might be able to do something like (totally making up the array indices):

    fdata.energy = fdata.data[0, :]
    fdata.i0     = fdata.data[2, :]
    fdata.fluor  = fdata.data[4:104, :].sum(axis=0)   # sum columns 5 to 104
    fdata.mu     = fdata.fluor / fdata.i0

If something like that doesn't work, send a data file and we'll figure it out.

--Matt
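In case it helps, here is a slightly fuller sketch along the same lines that loops over several scans at once. The column indices, the file pattern 'MyScan_*.txt', and the import path are all assumptions to adjust for your own files (inside the Larch shell, read_ascii() is available without the import):

    # a rough sketch, not tested on real data: assumes energy in column 1,
    # I0 in column 3, and the 100 Ge channels in columns 5-104, and that
    # Larch is being driven from Python.
    from glob import glob
    from larch.io import read_ascii   # import path may differ in older Larch versions

    groups = []
    for fname in sorted(glob('MyScan_*.txt')):     # hypothetical file pattern
        fdata = read_ascii(fname)
        fdata.energy = fdata.data[0, :]            # column 1: energy
        fdata.i0     = fdata.data[2, :]            # column 3: I0
        # sum the 100 fluorescence channels (columns 5 through 104)
        fdata.fluor  = fdata.data[4:104, :].sum(axis=0)
        fdata.mu     = fdata.fluor / fdata.i0
        groups.append(fdata)

Each group in the list then carries energy and mu arrays, so it can be passed on to the usual processing steps (pre_edge(), autobk(), and so on).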