I have a 6.5GB HPROF file that was dumped by a 64-bit JVM using the -XX:+HeapDumpOnOutOfMemoryError
option. The file is sitting on a 16GB, 64-bit machine, and I am trying to load it into jhat, but it keeps running out of memory. I have tried passing in JVM args for the heap settings, but it rejects any minimum I give it and seems to run out of memory before reaching the maximum.
It seems kind of silly that a JVM running out of memory dumps a heap so large that it can't be loaded on a box with twice as much RAM. Are there any ways of getting this running, or possibly amortizing the analysis?
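For context, the target JVM was started with flags along these lines (the heap size, dump path, and jar name here are illustrative, not my exact setup):

java -Xmx12g -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/dumps -jar myapp.jar

so the 6.5GB file is simply whatever the live heap contained at the moment of the OutOfMemoryError.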
Launch jhat with a command equivalent to

jhat -J-d64 -J-mx16g myheap.hprof

This starts jhat in 64-bit mode with a maximum heap size of 16 gigabytes.
If the JVM on your platform defaults to 64-bit operation, the -J-d64 option should be unnecessary.
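The -J prefix just forwards the flag to the JVM that runs jhat, so any standard heap option works there. As a sketch of a variant you could try if parsing is still slow, assuming your dump really is named myheap.hprof and the initial heap size of 8GB is just a guess for your box, pin the starting heap as well so the JVM doesn't spend time growing it:

jhat -J-d64 -J-Xms8g -J-Xmx16g myheap.hprof

Here -Xms/-Xmx are simply the long forms of the -ms/-mx options shown above.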