Hello to the PolyML users mailing list.
I'm an Isabelle/HOL user from the L4.verified project at NICTA, and we're having some issues with Isabelle using far too much memory. I have a feature request that would be useful in diagnosing the problem.
The current implementation of PolyML provides an object profiling mechanism, PolyML.objProfile, which analyses the memory consumed by a given object, decomposed by object size. As a minor issue, there doesn't seem to be a good way to profile the whole heap, since threading may make some objects unreachable from the current root function.
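(A partial workaround, assuming objProfile is polymorphic in its argument, would be to profile a tuple of every root one knows about, e.g. something like

ML> PolyML.objProfile (Thy_Info.get_theory "Nat", Thy_Info.get_theory "List");

but that still misses anything reachable only from other threads' state.)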
The big issue is that the profile doesn't tell me anything useful at all.
ML> PolyML.objProfile (Thy_Info.get_theory "Nat");
Immutable object sizes and counts
  1   229872
  2  2557083
  3   756596
  4  1283875
  5   811230
  6    24521
  7   102914
  8     9200
  9     9452
 10    20660
...
The development is too big for me to have any chance of identifying which objects of size 2, 4 and 5 are consuming all the memory. By contrast, an object profile could be really valuable if it were decomposed by type (or possibly by type and/or closure). This would hopefully make it straightforward to hunt down the worst-offending modules.
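To make the request concrete, here is a rough sketch of the kind of interface I have in mind; the name objProfileByType and the result shape are my invention, not anything that exists in PolyML today:

  (* hypothetical: profile everything reachable from the argument,
     grouped by type/constructor rather than by object size *)
  val objProfileByType : 'a -> (string * int * int) list
  (* one entry per type: (type name, object count, total words) *)

Even a printed report in the style of the current objProfile output, with a type name in place of each size, would do the job.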
On a related note, Don Stewart recently gave a talk about his experience working in Haskell at Galois, and pointed out that heap profiling by type in the Haskell runtime effectively eliminated their problems with the famous Haskell space-leak issue.
It seems to me this would be roughly compatible with the existing objProfile algorithm. I had a look at the code, and got as far as realising that I had no hope of understanding how PolyML encodes types internally. Does this seem like a reasonable feature request?
Yours, Thomas.