Java application high CPU
Environment
- Java
- Red Hat Enterprise Linux (RHEL) 4
- Red Hat Enterprise Linux (RHEL) 5
- Red Hat Enterprise Linux (RHEL) 6
- Red Hat Enterprise Linux (RHEL) 7
- Red Hat Enterprise Linux (RHEL) 8
Issue
- Java application using a large percentage of CPU
- Java application server (JBoss EAP, Tomcat, Jetty) high CPU utilization
- Java application consuming 100% of CPU
Resolution
- Garbage collection tuning
- Increase hardware (CPU or physical memory)
- Resolve threading issues
- Refactor application to limit memory or CPU usage
Root Cause
The following is a list of known issues:
- High CPU due to multiple Java threads accessing HashMap simultaneously
- High CPU due to multiple Java threads accessing TreeMap simultaneously
- Concurrent access to WeakHashMap in ConcurrentCache causes infinite loop and 100% CPU in JBoss
- High CPU due to Richfaces HashMap.get() infinite loop
- JBoss profiling shows CPU increases using Sun JDK 1.6 vs. JRockit JDK 1.6
- High CPU and concurrency issues running Seam hot deploy/debug mode in production
- Idle QuartzSchedulerThreads are consuming too much CPU
- High CPU due to RMI TCP Connection thread infinite loop on HashMap
- High CPU due to RichFaces concurrent access to the request attributes HashMap
- High CPU or many blocked threads in BaseClassLoader.getResourcesLocally
- High CPU due to rapid looping in TransactionReaper.check
- High CPU Utilization due to GZIP compression issues at the CXF level
- High CPU due to threads looping in OutputBuffer.realWriteChars
- Why is there high CPU usage after inserting the leap second?
- High CPU and heap usage in JBoss Session Replication
- High CPU in java.util.regex.Matcher
- Java application periodic high latency / processing times due to NUMA page reclaim on RHEL
- High CPU due to unsafe HashMap access in TomcatInjectionContainer.getEncInjectionsForClass
- Threads looping in WeakHashMap.get called from com.sun.facelets.tag.MetaRulesetImpl.getMetadataTarget on JBoss
- High CPU in NIO Selector
- High CPU due to heavy use of String.replaceAll
- High CPU in ScheduledThreadPoolExecutors after they are shutdown
- High CPU utilization for Java application running on VMWare ESX
- High CPU usage caused by JSF components not bound to request scope in EAP 6
- High CPU: Competing Java parallel garbage collection threads
- Java high CPU due to concurrent HashMap access in Apache Axis TypeMappingImpl.internalRegister
- High CPU and GC spikes from JDOM TextBuffer.append Calls
- High CPU in concurrent access to the JSSESupport keySizeCache map
- JBoss EAP 6.x server high response times and high CPU with many HttpManagementService threads
- High CPU in beanutils WeakHashMap
- High CPU in JSF ViewScopeManager.destroyBeans
- High CPU in org.jboss.sun.net.httpserver.AuthFilter.consumeInput and BoundaryDelimitedInputStream.fullRead
- JAX-WS client invocation time keeps increasing over life of JVM in EAP 6
- High CPU in TempFileProviderService thread
- High CPU and heap overhead in JMX when many rars are deployed
- High CPU in sun.net.www.MeteredStream.skip
- High CPU in org.apache.axis.utils.JavaUtils.isEnumClass
- JBoss threads looping in KeyAffinityServiceImpl.getKeyForAddress
- High CPU in "VM Thread" in Java application
- High CPU in JVM compiler threads
- High CPU in sun.nio.ch.EPollArrayWrapper.epollWait
- High CPU usage in JDG 6.1
- High CPU Usage By Infinispan In EAP 7
- High CPU in InternalNioOutputBuffer.sendAck
- High CPU in InternalInputBuffer.nextRequest after updating to 6.4.13
- High CPU in ContextConfig.processAnnotationsFile during Tomcat startup
- High CPU in XNIO accept thread
- Poor performance and sporadic high CPU usage of Java 7 application due to compiler cache exhaustion
- ajp-apr--Poller thread consumes high CPU in Tomcat 8
- Intermittent problems after updating to EAP 6.4 CP17
- High CPU in XNIO code after updating to EAP 6.4 CP17
- Degraded performance in EAP 7.1 from ImportedClassELResolver
- EAP 7 performance issues with StuckThreadDetectionHandler enabled
- High CPU in Channels.writeBlocking
- Logic performing several hundred queries takes significantly more time in Hibernate 4
- High CPU overhead in JMX operations when many rolled log files exist
- High CPU in selector polls on EAP 7 with HTTP/2 enabled
- High CPU in selector polls on EAP 7
- High CPU from StandardContext.backgroundProcess & WebappLoader.modified in ContainerBackgroundProcessor thread
- High CPU load and slowness in GCM cipher encryption
- Logger Lookup Much Slower and High CPU on JDK11
- High CPU on EAP 7 in EJBClientInvocationContext.awaitResponse
- Eclipse MicroProfile metrics causing slow startup and high CPU on EAP
- Tomcat experiences high CPU in OpenSSLEngine.unwrap
- Syscall to futex causes 100% CPU utilization by Java application
- Remote EJB client hangs with high CPU after several hours and large number of invocations in EAP 7
- EAP 7 has high CPU in KeyAffinityServiceImpl
- High CPU due to continuous G1 concurrent cycles initiated by humongous allocations
- High CPU in io.undertow.protocols.ssl.SslConduit.wrapAndFlip
- I/O threads experience increased CPU usage for https-listener when task worker thread pool is exhausted in JBoss EAP 7
- High CPU in AbstractClassLoaderValue.computeIfAbsent calls
- High CPU in ScheduledThreadPoolExecutors with 0 corePoolSize
- JBoss high CPU in io.undertow.servlet.spec.ServletPrintWriter.close
- EAP 7.4.9+ degrades performance with increased CPU in statement overhead
Diagnostic Steps
High CPU utilization is a common issue and can have many causes: excessive garbage collection, swapping, CPU-intensive operations, threading issues, and so on.
Data Collection
The data required for this analysis is well established. To determine the cause of the high CPU, gather all of the following data from the period of high CPU:
- Java garbage collection logging
It is a best practice to enable standard Java garbage collection logging, so this data should always be available. See: How do I enable Java garbage collection logging?.
- OS thread CPU usage
- Java thread dumps
- How do I identify high CPU utilization by Java threads on Linux/Solaris
- How do I identify high CPU utilization by Java threads on Windows
- Application logging (if the Java application is an application server):
- JBoss EAP 4 / 5:
$JBOSS_HOME/server/$PROFILE/log/server.log
$JBOSS_HOME/server/$PROFILE/log/boot.log
- JBoss EAP 6 standalone mode:
$JBOSS_HOME/standalone/log/server.log
- JBoss EAP 6 domain mode:
$JBOSS_HOME/domain/log/host-controller.log
$JBOSS_HOME/domain/log/domain-controller.log
$JBOSS_HOME/domain/servers/$SERVER/log/server.log
- Tomcat:
catalina.out
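For reference, garbage collection logging (the first item above) is enabled with JVM startup flags. Below is a hypothetical launch-script fragment; flag names differ between JDK versions, and the log path is a placeholder:

```shell
# JDK 8 and earlier:
JAVA_OPTS="$JAVA_OPTS -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCDateStamps -Xloggc:/var/log/app/gc.log"

# JDK 9 and later (unified logging replaces the flags above):
JAVA_OPTS="$JAVA_OPTS -Xlog:gc*:file=/var/log/app/gc.log:time,uptime,level,tags"
```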
Analysis
- Garbage Collection
The most common cause of high CPU is excessive garbage collection (GC). Excessive garbage collection can also cause threading issues. Garbage collection issues must be addressed before doing further analysis. This should be very straightforward: having standard garbage collection logging enabled is a best practice, and good tooling is available for the analysis.
- For general garbage collection analysis, see How do I analyze Java garbage collection logging?.
- If you see long pause times and/or low throughput overall, see Java garbage collection long pause times.
- To see garbage collection activity at the time of the issue, analyze the gc logging around the time of the first timestamp in the high CPU data gathered. If the garbage collection logging does not have a timestamp but only the number of seconds after JVM startup, use the first timestamp in the console log, boot.log (JBoss), server.log (JBoss), or catalina.out (Tomcat), and calculate the number of seconds to the first timestamp in the high CPU data gathered.
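That offset can be computed from the two wall-clock timestamps. A minimal sketch with placeholder timestamps, using GNU `date` as found on RHEL:

```shell
# Placeholder values: JVM start time (first timestamp in boot.log /
# catalina.out) and the first timestamp in the high CPU data gathered.
JVM_START="2024-05-01 12:00:00"
INCIDENT="2024-05-01 12:30:00"
# Seconds after JVM startup at which to look in the gc log:
OFFSET=$(( $(date -d "$INCIDENT" +%s) - $(date -d "$JVM_START" +%s) ))
echo "$OFFSET"   # -> 1800
```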
- Thread(s) Consuming CPU
If garbage collection is not responsible for the high CPU, identify the thread(s) consuming CPU:
- How do I identify high CPU utilization by Java threads on Linux/Solaris
- How do I identify high CPU utilization by Java threads on Windows
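On Linux, the per-thread CPU data and the thread dumps can be correlated roughly as follows. This is a sketch: the PID and thread id are placeholder values, and the `top -H` column layout varies by version.

```shell
PID=12345   # example: PID of the Java process showing high CPU
# Repeat a few times, ~10 seconds apart, during the high CPU period:
#   top -bH -n 1 -p "$PID" > top_threads.txt   (per-thread CPU snapshot)
#   jstack "$PID" > jstack.txt                 (thread dump at the same time)

# top -H prints thread ids (the PID column) in decimal; jstack prints them
# as nid=0x... in hex, so convert before searching the dump:
TID=12345                     # example: decimal thread id of the hottest thread
NID=$(printf '0x%x' "$TID")   # hex form used in the thread dump
echo "$NID"                   # -> 0x3039
# grep "nid=$NID" jstack.txt  # shows that thread's stack in the dump
```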
- Application Logging
The application logging may provide additional clues and is collected for completeness.
Check the application logging for OutOfMemory errors.
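A quick way to check the collected logs for OutOfMemory errors is a simple `grep`. In this sketch, the sample log line is fabricated purely for illustration, and `server.log` stands in for whichever logs were gathered above:

```shell
# Create a tiny sample log purely to illustrate the search (illustration only)
printf 'INFO startup complete\nERROR java.lang.OutOfMemoryError: Java heap space\n' > server.log

# List matching lines with line numbers across the collected logs
grep -n "OutOfMemoryError" server.log
```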
For analyzing JBoss server logs consider the Access Labs app Log Reaper.
This solution is part of Red Hat’s fast-track publication program, providing a huge library of solutions that Red Hat engineers have created while supporting our customers. To give you the knowledge you need the instant it becomes available, these articles may be presented in a raw and unedited form.