Java application OutOfMemoryError PermGen space

Solution Verified - Updated

Environment

  • OpenJDK
  • Sun JDK
  • Sun JRE

Issue

  • OutOfMemoryError: PermGen space
  • Java application hung due to the following error:
ERROR [org.apache.catalina.connector.CoyoteAdapter] An exception or error occurred in the container during the request processing
java.lang.OutOfMemoryError: PermGen space
	at sun.misc.Unsafe.defineClass(Native Method)
	at sun.reflect.ClassDefiner.defineClass(ClassDefiner.java:45)
	at sun.reflect.MethodAccessorGenerator$1.run(MethodAccessorGenerator.java:381)
	at java.security.AccessController.doPrivileged(Native Method)
	at sun.reflect.MethodAccessorGenerator.generate(MethodAccessorGenerator.java:377)
	at sun.reflect.MethodAccessorGenerator.generateConstructor(MethodAccessorGenerator.java:76)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:30)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.jboss.web.tomcat.security.SecurityAssociationActions$5.run(SecurityAssociationActions.java:248)
	at org.jboss.web.tomcat.security.SecurityAssociationActions$5.run(SecurityAssociationActions.java:244)
	at java.security.AccessController.doPrivileged(Native Method)
	at org.jboss.web.tomcat.security.SecurityAssociationActions.createSecurityContext(SecurityAssociationActions.java:243)
	at org.jboss.web.tomcat.security.SecurityContextEstablishmentValve.createSecurityContext(SecurityContextEstablishmentValve.java:78)
	at org.jboss.web.tomcat.security.SecurityContextEstablishmentValve.process(SecurityContextEstablishmentValve.java:110)
	at org.jboss.web.tomcat.security.SecurityContextEstablishmentValve.invoke(SecurityContextEstablishmentValve.java:70)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
	at org.jboss.web.tomcat.service.jca.CachedConnectionValve.invoke(CachedConnectionValve.java:158)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:330)
	at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:829)
	at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:598)
	at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:447)
	at java.lang.Thread.run(Thread.java:619)

Resolution

  • See jvmconfig, a Red Hat Customer Portal Labs app, for an interactive way to generate an optimized JVM configuration for your environment.

  • If the permanent generation is too small, increase its size. For example:

      -XX:PermSize=256M -XX:MaxPermSize=256M
    

    For EAP 5 and earlier, this is typically adjusted in the run.conf file; for EAP 6 standalone mode, in standalone.conf. For EAP 6 domain mode, the JVM settings may be set in either domain.xml or host.xml, for example:

      <jvm name="default">
          <heap size="1303m" max-size="1303m"/>
          <permgen max-size="512m"/>
      </jvm>
    
  • On Windows, if JAVA_OPTS is set outside of run.conf.bat, run.conf.bat skips its own JAVA_OPTS settings (including the PermGen size) because of the following line. Comment out or remove that line so those settings are not skipped:

      if not "x%JAVA_OPTS%" == "x" goto JAVA_OPTS_SET
    

    Change the following line:

      set "JAVA_OPTS=-Xms1303m -Xmx1303m -XX:MaxPermSize=256m -Dorg.jboss.resolver.warning=true -Dsun.rmi.dgc.client.gcInterval=3600000 -Dsun.rmi.dgc.server.gcInterval=3600000 -Dsun.lang.ClassLoader.allowArraySyntax=true"
    

    to:

      set "JAVA_OPTS=%JAVA_OPTS% -Xms1303m -Xmx1303m -XX:MaxPermSize=256m -Dorg.jboss.resolver.warning=true -Dsun.rmi.dgc.client.gcInterval=3600000 -Dsun.rmi.dgc.server.gcInterval=3600000 -Dsun.lang.ClassLoader.allowArraySyntax=true"
    

    so that it does not override the previously set JAVA_OPTS.

  • If the issue is related to hot deployment/redeployment, consider changing practices to avoid it. If hot deployment/redeployment is unavoidable, schedule a subsequent restart of the application server during off hours to release any leaked classloaders.
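A classloader leak of the kind described above typically happens when an object loaded by the application's classloader stays reachable from a longer-lived scope after undeployment. The following is a hypothetical sketch (the `LeakDemo` and `REGISTRY` names are illustrative, not from any real framework) of the pattern:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of why a redeployed application's classes may not be
// unloaded: a long-lived static registry (owned by a parent classloader)
// keeps an application object reachable, pinning that object's class and,
// with it, the defining classloader and every class it loaded.
public class LeakDemo {
    // Lives for the life of the JVM, not the life of the deployment.
    static final List<Object> REGISTRY = new ArrayList<>();

    public static void main(String[] args) {
        // Stands in for an object whose class came from the webapp classloader.
        Object appObject = new Object() {};
        REGISTRY.add(appObject);
        // After an "undeploy", REGISTRY still references appObject, so its
        // class metadata cannot be reclaimed from the permanent generation.
        System.out.println(REGISTRY.size());
    }
}
```

Restarting the JVM is the only way to release classloaders pinned this way, which is why a scheduled off-hours restart is suggested when hot deployment cannot be avoided.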

  • If the issue is caused by dynamic proxies, change the code to reduce their use. For example:

    1. Cache the proxy so that it is only created once and reused for future calls.
    2. Do not use a proxy at all (e.g. make direct JMX calls).
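The first option (caching the proxy) can be sketched as follows; `ProxyCache` and `Greeter` are hypothetical names used only for illustration:

```java
import java.lang.reflect.Proxy;

// Illustrative sketch: create the dynamic proxy once, in a static field,
// so the proxy class is generated a single time and reused on every call
// instead of being regenerated per request.
public class ProxyCache {
    public interface Greeter { String greet(String name); }

    // Created once at class initialization and reused thereafter.
    private static final Greeter CACHED_PROXY = (Greeter) Proxy.newProxyInstance(
            Greeter.class.getClassLoader(),
            new Class<?>[] { Greeter.class },
            (proxy, method, args) -> "Hello, " + args[0]);

    public static Greeter getGreeter() {
        return CACHED_PROXY; // same instance every time; no new proxy class
    }

    public static void main(String[] args) {
        Greeter a = getGreeter();
        Greeter b = getGreeter();
        System.out.println(a == b);           // true: one proxy instance reused
        System.out.println(a.greet("world")); // Hello, world
    }
}
```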

Root Cause

OpenJDK and the Sun JDK have a dedicated permanent generation (PermGen) space. The error indicates that the data stored in this space exceeds its allocated size. Known causes include a permanent generation that is sized too small for the application, classloader leaks from hot deployment/redeployment, excessive dynamic proxy generation, and heavy use of String.intern().

Diagnostic Steps

  • Check the VM arguments (printed in the JBoss boot.log as of EAP 4.2 CP07 and 4.3 CP06) to verify the desired PermGen settings are being found and applied.
  • Test setting the permanent generation size to 128M, 256M, 512M, or even higher to see if the issue goes away. For example:
    -XX:PermSize=256M -XX:MaxPermSize=256M
    
  • Are any applications being hot deployed or redeployed?
  • Enable garbage collection logging (How do I enable Java garbage collection logging?) and analyze it (How do I analyze Java garbage collection logging?).
    • Verify the permanent generation space is getting filled up. This should lead to repeated full garbage collections trying to free space prior to the OutOfMemoryError.
  • Use jmap with OpenJDK and Sun JDK 1.5 and later (JDK 1.6 for Linux Itanium and Windows) to determine what classes are consuming permanent generation space. Output will show class loader, # of classes, bytes, parent loader, alive/dead, type, and totals. For example:
    jmap -permstat JBOSS_PID  >& permstat.out
    
  • Once the permstat information has been obtained, examine the largest offenders by sorting on the bytes column, for instance:
    cat permstat.out | sort -k3,3 -n
    
  • You can also review the permstat information to determine the overall usage of String.intern() by looking at the intern occupancy statement; for instance, the following shows approximately 329 MB of interned Strings:
    3504167 intern Strings occupying 329630224 bytes.
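    On JDK 6 and earlier, interned Strings live in the permanent generation, so heavy use of String.intern() contributes directly to this occupancy. A minimal illustration of what interning does:

```java
public class InternDemo {
    public static void main(String[] args) {
        String a = new String("permgen");   // distinct object on the Java heap
        String b = a.intern();              // canonical copy from the string pool
        System.out.println(a == b);         // false: 'a' is not the pooled copy
        System.out.println(b == "permgen"); // true: the literal shares the pool entry
    }
}
```

    Each distinct interned value occupies the pool until the defining code is unloaded, so millions of unique interned Strings (as in the permstat line above) translate directly into PermGen pressure.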
    
  • Additionally, if a specific class appears repeatedly, the total memory it consumes can be summed with the following command:
    grep $CLASSNAME permstat.out | awk '{sum += $3} END {print sum}'
    
  • Get a heap dump (How do I create a Java heap dump?) and analyze it (How do I analyze a Java heap dump?):
    • A heap dump does not contain direct permanent generation retention data; however, it does provide useful troubleshooting information to identify classloader leaks.
    • The Eclipse Memory Analyzer Tool has a Classloader Explorer view that shows defined classes and number of instances per classloader. An idea of the amount of permanent generation space a classloader uses can be determined based on the number of associated classes and instances.
    • Check the Classloader Explorer for classloaders that are unexpectedly kept alive and use the Duplicate Classes function.
    • Use the following Object Query Language (OQL) queries to find duplicate deployments (e.g. duplicate application war/ear/jars):
      SELECT toString(oname._canonicalName) FROM org.jboss.web.tomcat.service.WebAppLoader
      SELECT toString(origURL.path) FROM org.jboss.mx.loading.UnifiedClassLoader3
      #EAP5
      SELECT toString(policy.name) FROM org.jboss.classloader.spi.base.BaseClassLoader 
      #EAP6
      SELECT module.identifier.name.value.toString() FROM org.jboss.modules.ModuleClassLoader
      
    • Search for java.lang.reflect.Proxy in the histogram view, select the result, and look at the value of the nextUniqueNumber field, which indicates how many dynamic proxy classes have been created.
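Each distinct interface combination passed to Proxy.newProxyInstance causes a new proxy class to be generated (its name embeds the unique counter, e.g. $Proxy0, $Proxy1) and stored in the permanent generation. A minimal sketch, with hypothetical interfaces A and B:

```java
import java.lang.reflect.Proxy;

public class ProxyCount {
    interface A {}
    interface B {}

    public static void main(String[] args) {
        ClassLoader cl = ProxyCount.class.getClassLoader();
        // Two different interface sets -> two generated proxy classes.
        Object p1 = Proxy.newProxyInstance(cl, new Class<?>[]{A.class}, (pr, m, a) -> null);
        Object p2 = Proxy.newProxyInstance(cl, new Class<?>[]{B.class}, (pr, m, a) -> null);
        System.out.println(p1.getClass() != p2.getClass());                // true
        System.out.println(p1.getClass().getName().contains("$Proxy"));    // true
    }
}
```

A large nextUniqueNumber value therefore indicates many generated proxy classes, each holding class metadata in PermGen.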
  • If running JBoss on Windows under a service wrapper, start JBoss directly with run.bat to rule out any service wrapper issues.

This solution is part of Red Hat’s fast-track publication program, providing a huge library of solutions that Red Hat engineers have created while supporting our customers. To give you the knowledge you need the instant it becomes available, these articles may be presented in a raw and unedited form.