What impact, if any, does the -d64 switch have on Sun JVM resident memory usage?

Stu Thompson · Sep 18, 2009 · Viewed 7.6k times

I've got this webapp that needs some memory tuning. While I'm already profiling the application itself and trimming things down, the JVM itself seems overly bloated to me on our busiest instance. (The lower volume instances do not have this problem.) The details:

  • Platform:
    • RHEL4 64-bit (Linux 2.6.9-78.0.5.ELsmp #1 SMP x86_64)
    • Sun Java 6 (Java HotSpot(TM) 64-Bit Server VM (build 10.0-b23, mixed mode))
    • Tomcat 6 with -d64 in startup.sh
  • My webapp currently has some code that in production requires the benefits of running 64-bit.
  • I've observed that after some time (a week) the JVM's resident memory size (as shown by top) is three times the size of my -Xmx setting.
  • The non-heap memory sizes, etc. are all relatively trivial: a mere single-digit percentage of the heap size. (The sketch after this list shows one way to read these figures from inside the JVM.)
  • There is only one section of code that requires a 64-bit address space.
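For reference, here is a minimal sketch (the class name is mine, not from the app) of how the heap vs. non-heap split can be read from inside the JVM using the standard MemoryMXBean; jconsole reports the same figures:

    import java.lang.management.ManagementFactory;
    import java.lang.management.MemoryMXBean;
    import java.lang.management.MemoryUsage;

    // Hypothetical helper class: prints the JVM's own view of heap vs. non-heap memory.
    public class HeapVsNonHeap {
        public static void main(String[] args) {
            MemoryMXBean mem = ManagementFactory.getMemoryMXBean();
            MemoryUsage heap = mem.getHeapMemoryUsage();
            MemoryUsage nonHeap = mem.getNonHeapMemoryUsage();
            // "committed" is what the JVM has actually reserved; compare these
            // figures with the resident size (RES) that top reports for the process.
            System.out.printf("heap:     used=%dM committed=%dM max=%dM%n",
                    heap.getUsed() >> 20, heap.getCommitted() >> 20, heap.getMax() >> 20);
            System.out.printf("non-heap: used=%dM committed=%dM max=%dM%n",
                    nonHeap.getUsed() >> 20, nonHeap.getCommitted() >> 20, nonHeap.getMax() >> 20);
        }
    }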

If I could refactor out the need for a 64-bit JVM, and drop the -d64 switch, would that make the JVM's resident memory footprint smaller? In other words...

What impact, if any, does the -d64 switch have on the Sun JVM resident memory usage?

Answer

Vineet Reynolds · Sep 18, 2009

Using the -d64 switch puts the JVM into 64-bit mode. Technically, on Solaris/Linux and most Unixes, the JVM process will execute in the LP64 data model.
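A quick way to confirm which mode a particular JVM instance is actually running in is to check a couple of system properties (sun.arch.data.model is Sun-specific and may not exist on other vendors' VMs); plain java -version prints much the same information:

    // Small sketch: report whether this JVM is running as 32-bit or 64-bit.
    public class DataModelCheck {
        public static void main(String[] args) {
            // "32" or "64" on Sun JVMs; may be null on non-Sun VMs.
            System.out.println("data model: " + System.getProperty("sun.arch.data.model"));
            System.out.println("vm name:    " + System.getProperty("java.vm.name"));
            System.out.println("os arch:    " + System.getProperty("os.arch"));
        }
    }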

The LP64 model differs from the 32-bit model (ILP32) in that pointers are 64 bits wide instead of 32. For the JVM this allows greater memory addressability, but it also means that the space occupied by object references alone has doubled. So for the same number of live objects at a given time, a 64-bit JVM carries noticeably more bloat than a 32-bit one.
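To see the doubled reference width directly, here is a rough sketch (class name and slot count are arbitrary, and the absolute numbers depend on GC timing and object alignment) that estimates the bytes consumed per slot of a reference array; run it once without -d64 and once with it, and compare:

    // Rough estimate of bytes per object-reference slot; expect roughly 4 in
    // 32-bit mode and roughly 8 under -d64 without compressed oops.
    public class ReferenceFootprint {
        public static void main(String[] args) throws Exception {
            final int slots = 4 * 1000 * 1000;
            Runtime rt = Runtime.getRuntime();
            System.gc();
            Thread.sleep(200);                      // give the GC a moment to settle
            long before = rt.totalMemory() - rt.freeMemory();
            Object[] refs = new Object[slots];      // an array of null references
            long after = rt.totalMemory() - rt.freeMemory();
            System.out.println("approx. bytes per reference slot: "
                    + (double) (after - before) / slots);
            System.out.println("(array length " + refs.length + " kept reachable)");
        }
    }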

Another thing that is often forgotten is the size of the instructions themselves: on a 64-bit JVM, instructions operate on native-register-sized (64-bit) values, which adds to the footprint as well.

If, however, you use compressed object pointers in a 64-bit environment, the JVM encodes and decodes pointers wherever possible for heap sizes greater than 4 GB. Briefly stated, with compressed pointers the JVM stores object references as 32-bit values as much as possible.
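The encode/decode step can be pictured with a bit of arithmetic. The sketch below is purely conceptual (HEAP_BASE is a made-up value, and the real logic lives inside HotSpot's native code): because objects are 8-byte aligned, the low three bits of an address are always zero, so shifting them away lets a 32-bit value cover a much larger heap relative to the heap base.

    // Conceptual illustration only -- NOT the actual HotSpot implementation.
    public class CompressedOopSketch {
        static final long HEAP_BASE = 0x800000000L;  // hypothetical heap base address

        // Compress a 64-bit address into a 32-bit "oop" by dropping the
        // heap base and the three alignment bits.
        static int encode(long address) {
            return (int) ((address - HEAP_BASE) >>> 3);
        }

        // Expand the 32-bit value back into a full 64-bit address.
        static long decode(int compressed) {
            return HEAP_BASE + ((compressed & 0xFFFFFFFFL) << 3);
        }

        public static void main(String[] args) {
            long address = HEAP_BASE + 5L * 1024 * 1024 * 1024;  // an "object" 5 GB into the heap
            int oop = encode(address);
            System.out.printf("address=0x%x compressed=0x%x round-trip ok=%b%n",
                    address, oop, decode(oop) == address);
        }
    }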

Hint: Switch on the UseCompressedOops flag with -XX:+UseCompressedOops to get rid of some of the bloat. YMMV, but people have reported up to a 50% drop in memory usage with compressed oops.
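On your setup the flag would typically go next to -d64 in the Tomcat startup script (JAVA_OPTS / CATALINA_OPTS). To verify whether it actually took effect in the running VM, something like the following works (HotSpotDiagnosticMXBean is a Sun-specific API, and on VMs that predate the flag the lookup will throw because the option does not exist):

    import java.lang.management.ManagementFactory;
    import com.sun.management.HotSpotDiagnosticMXBean;
    import com.sun.management.VMOption;

    // Sketch: ask the running VM whether UseCompressedOops is enabled.
    public class CheckCompressedOops {
        public static void main(String[] args) throws Exception {
            HotSpotDiagnosticMXBean bean = ManagementFactory.newPlatformMXBeanProxy(
                    ManagementFactory.getPlatformMBeanServer(),
                    "com.sun.management:type=HotSpotDiagnostic",
                    HotSpotDiagnosticMXBean.class);
            VMOption option = bean.getVMOption("UseCompressedOops");
            System.out.println("UseCompressedOops = " + option.getValue());
        }
    }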

EDIT

The UseCompressedOops flag is supported in version 14.0 of the Java HotSpot VM, available from Java 6 Update 14 onwards.