[Ifeffit] care and feeding of Athena
newville at cars.uchicago.edu
Wed May 9 14:49:26 CDT 2007
From seeing the project file that was causing problems, I think the
problem was not an Ifeffit memory issue, but a problem with Artemis
reading chi(k) from Athena projects that use a standard in the
background removal.
That's not to say that memory limitations aren't real -- I've
definitely seen corrupted data with large data sets. But I don't
think the project in question is at that limit.
> It occurs to me that a better way would be for Ifeffit (or Athena) to
> keep track of how much memory swapping is going on. Once it reaches
> some large but safe value, a warning is sent to the user. In this
> scenario the deleting and reading in a bunch of new groups would also
> trigger the warning. This seems like it would be hard to implement
> -- negative 2 cents.
Ifeffit does keep track of how much of its internal memory for array
data is in use (via &max_heap and &heap_free), and Athena / Artemis
read those values. The project in question, with ~150 groups of data,
uses about 5% of the heap.
We often run Athena with considerably more internal data than this
example (with multi-element detectors), and often swap out projects
without trouble. Those actions exercise the heap space much more than
reading in one project with ~150 groups of mu(E) data. My "torture
tests" for Ifeffit have shown no corrupted data (though Ifeffit
definitely slows down!) with more than 90% of the heap space in use
(~1000 arrays of data). There were definitely problems in the past
with erasing and moving lots of groups around, and I think these may
not all be resolved....
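The heap bookkeeping described above could, in principle, drive the
kind of warning the quoted suggestion asks for. Here is a minimal
sketch in Python, assuming a get_scalar callable that stands in for
however a wrapper might read Ifeffit's &max_heap and &heap_free
program variables; the function names and the 90% threshold are
illustrative, not real Athena or Ifeffit API:

```python
def heap_usage(get_scalar):
    """Fraction of Ifeffit's array heap currently in use.

    get_scalar is a hypothetical callable mapping an Ifeffit program
    variable name (e.g. "&max_heap") to its numeric value.
    """
    max_heap = get_scalar("&max_heap")
    heap_free = get_scalar("&heap_free")
    return 1.0 - heap_free / max_heap

def maybe_warn(get_scalar, threshold=0.90, warn=print):
    """Warn the user once heap usage crosses a threshold.

    The 0.90 default reflects the observation above that slowdowns
    appear past ~90% of heap space in use.
    """
    used = heap_usage(get_scalar)
    if used >= threshold:
        warn("Ifeffit heap is %.0f%% full -- consider saving your "
             "project and restarting" % (100 * used))
    return used

# Illustration with made-up numbers: a project using ~5% of the heap
# (roughly the ~150-group case discussed above) triggers no warning.
fake = {"&max_heap": 4_000_000.0, "&heap_free": 3_800_000.0}
maybe_warn(fake.get)
```

A GUI could call such a check after each project read or group
deletion, which would also cover the delete-and-reload scenario in
the quoted message.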
Personally, I'd like to study problem cases (I'm still trying to
produce a simple test case) before deciding the details. I don't see
a real benefit to making the maximum number of groups a user setting,
especially as the number of groups is not going to be the actual
limit.