Tuesday, September 16, 2008

Utilizing 64-bit JVMs in Java CAPS Integration Server

A very common error observed with the Java CAPS Integration Server is "Out of Memory". When this happens, we typically see the following:

  • no further applications can be deployed to the Application Server
  • the Integration Server slows down

We can analyze the error at the application level using profilers or other tools such as 'jhat'. At the same time, the application itself may simply have higher memory requirements.

The CPU architecture places a limit on the JVM heap size that can be allocated to a process. Some operating systems also reserve portions of the process address space for OS use, effectively reducing the total address space available for mapping memory for user programs. For instance, on 32-bit Windows XP, DLLs and userland OS components are mapped into each process's address space, leaving only 2 to 3.8 GB of address space (depending on the settings) available, even if the computer has 4 GiB of RAM. This restriction is not present in 64-bit Windows.

The Java HotSpot FAQ says "The maximum theoretical heap limit for the 32-bit JVM is 4G. Due to various additional constraints such as available swap, kernel address space usage, memory fragmentation, and VM overhead, in practice the limit can be much lower. On most modern 32-bit Windows systems the maximum heap size will range from 1.4G to 1.6G. On 32-bit Solaris kernels the address space is limited to 2G. On 64-bit operating systems running the 32-bit VM, the max heap size can be higher, approaching 4G on many Solaris systems. ... If your application requires a very large heap you should use a 64-bit VM on a version of the operating system that supports 64-bit applications."

It further states "Generally, the benefits of being able to address larger amounts of memory come with a small performance loss in 64-bit VMs versus running the same application on a 32-bit VM. This is due to the fact that every native pointer in the system takes up 8 bytes instead of 4. The loading of this extra data has an impact on memory usage which translates to slightly slower execution depending on how many pointers get loaded during the execution of your Java program. The good news is that with AMD64 and EM64T platforms running in 64-bit mode, the Java VM gets some additional registers which it can use to generate more efficient native instruction sequences. These extra registers increase performance to the point where there is often no performance loss at all when comparing 32 to 64-bit execution speed. The performance difference comparing an application running on a 64-bit platform versus a 32-bit platform on SPARC is on the order of 10-20% degradation when you move to a 64-bit VM. On AMD64 and EM64T platforms this difference ranges from 0-15% depending on the amount of pointer accessing your application performs."
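
As a quick sanity check, a small program like the one below can confirm which data model a given JVM is running in and how large a heap it will actually use. This is a minimal, hypothetical sketch (the class name is made up, and sun.arch.data.model is a Sun/Oracle-specific system property that may be absent on other vendors' JVMs):

    public class JvmCheck {
        public static void main(String[] args) {
            // "32" or "64" on Sun/Oracle JVMs; falls back to "unknown" elsewhere
            System.out.println("Data model: " + System.getProperty("sun.arch.data.model", "unknown"));
            System.out.println("OS arch:    " + System.getProperty("os.arch"));
            // Upper bound the heap can grow to (roughly the effective -Xmx), in MB
            long maxHeapMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
            System.out.println("Max heap:   " + maxHeapMb + " MB");
        }
    }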

If we are deploying Java CAPS applications on 64-bit operating systems with large amounts of memory, we should leverage the RAM available on such platforms and high-end server machines. We should therefore be able to start the Logical Host or the application server with the following JVM options so that the server starts with a large heap.

We should be able to modify the <jvm-options> section of the file <logicalhost>/is/domains/<domain_name>/config/domain.xml as follows (a consolidated example appears after this list):

  • Add <jvm-options>-d64</jvm-options> at the start of the <jvm-options> section -- If neither -d32 nor -d64 is specified, the default is to run in a 32-bit environment.
  • Add either <jvm-options>-XX:+UseParallelGC</jvm-options> or <jvm-options>-XX:+UseConcMarkSweepGC</jvm-options>, depending upon the nature of the application(s) being deployed on the Integration Server. For background, refer to GC Tuning.
  • Modify the following options in the section to appropriate values.
    • <jvm-options>-Xmx16384m</jvm-options>
    • <jvm-options>-XX:MaxPermSize=2048m</jvm-options>
    • The values given above are for an 8-core, 64 GB RAM machine
  • The option <jvm-options>-server</jvm-options> may be removed, since the -server option is implicit when -d64 is used
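
Putting these changes together, the relevant portion of the <jvm-options> section might look like the snippet below. The values are only illustrative (they correspond to the 8-core, 64 GB machine mentioned above) and the choice of collector depends on the workload; treat this as a sketch, not a recommended configuration:

    <jvm-options>-d64</jvm-options>
    <jvm-options>-XX:+UseParallelGC</jvm-options>
    <jvm-options>-Xmx16384m</jvm-options>
    <jvm-options>-XX:MaxPermSize=2048m</jvm-options>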

Common Reasons for Out Of Memory Errors

An Out of Memory error can happen for the following reasons:

  • Garbage Collection Issues
  • Orphaned classloaders
    • Thread context classloader
    • new Thread()
    • Dangling thread
  • Classes kept reachable through the following references
    • static variables
    • SQL Driver
    • Commons logging
    • java.util.logging.Level
    • BeanUtils

and so on; one such pattern is sketched below.
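
In the dangling-thread case, for example, a component starts a worker thread and never stops it. The sketch below is hypothetical (not Java CAPS code): as long as the thread runs, it keeps its context classloader, and therefore every class that classloader loaded, strongly reachable, so the application's classes cannot be unloaded even after undeployment.

    public class PollerStarter {
        public static void start() {
            Thread worker = new Thread(new Runnable() {
                public void run() {
                    while (true) {                  // no shutdown flag: the loop never ends
                        try {
                            Thread.sleep(60000L);   // "poll" once a minute, forever
                        } catch (InterruptedException e) {
                            // interrupt swallowed, so even interrupt() cannot stop the thread
                        }
                    }
                }
            });
            worker.start();                         // nothing stops this thread on undeploy
        }
    }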

We can analyze the problems associated with such errors at the application level by collecting GC details and/or a heap dump (an example domain.xml snippet follows the list below):

  • -verbose:gc together with -XX:+PrintGCDetails, for observing GC behaviour so that a suitable collector can be chosen or tuned for the application
  • -XX:+HeapDumpOnOutOfMemoryError
  • -Xrunhprof:heap=dump,format=b
  • jmap -dump:format=b,file=heap.bin <pid>
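
For the Integration Server these flags can be added to the same <jvm-options> section of domain.xml described earlier, for example:

    <jvm-options>-verbose:gc</jvm-options>
    <jvm-options>-XX:+PrintGCDetails</jvm-options>
    <jvm-options>-XX:+HeapDumpOnOutOfMemoryError</jvm-options>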

Analyze the heap dump using jhat

  • Run jhat -J-mx1024m heap.bin
  • Browse to http://localhost:7000
  • Use built-in or custom queries to narrow down leak suspects (example OQL queries follow this list)
  • Identify a suspect object or class in the application
  • List reference chains from the root set to see what keeps it reachable
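
As an example of custom queries, jhat's OQL page at http://localhost:7000/oql/ accepts SQL-like queries over the heap. The two queries below are illustrative and assume the JDK 6-era jhat (where java.lang.String still has a count field): the first lists unusually large strings, the second lists the referrers of each Thread object.

    select s from java.lang.String s where s.count >= 10000
    select referrers(t) from java.lang.Thread t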