Java application high CPU

Solution Verified - Updated

Environment

  • Java
  • Red Hat Enterprise Linux (RHEL) 4
  • Red Hat Enterprise Linux (RHEL) 5
  • Red Hat Enterprise Linux (RHEL) 6
  • Red Hat Enterprise Linux (RHEL) 7
  • Red Hat Enterprise Linux (RHEL) 8

Issue

  • Java application using a large percentage of CPU
  • Java application server (JBoss EAP, Tomcat, Jetty) high CPU utilization
  • Java Application consuming 100% of CPU

Resolution

  • Garbage collection tuning
  • Increase hardware (CPU or physical memory)
  • Resolve threading issues
  • Refactor application to limit memory or CPU usage

Root Cause

The root cause varies by application and environment; use the Diagnostic Steps below to identify it.

Diagnostic Steps

High CPU utilization is a common issue with many possible causes: excessive garbage collection, swapping, CPU-intensive operations, threading issues, and so on.

Data Collection

To determine the cause of the high CPU, gather all of the following data from the period of high CPU utilization:

  1. Java garbage collection logging

It is a best practice to enable standard Java garbage collection logging, so this data should always be available. See: How do I enable Java garbage collection logging?
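If it is not already enabled, GC logging can be turned on with JVM options along the following lines; the log path is a placeholder, and the exact flags depend on the JDK version (a sketch, not an exhaustive set):

```shell
# JDK 8 and earlier (the log path is a placeholder -- adjust for your system):
JAVA_OPTS="${JAVA_OPTS:-} -verbose:gc -Xloggc:/var/log/app/gc.log \
  -XX:+PrintGCDetails -XX:+PrintGCDateStamps"

# JDK 9 and later, the unified-logging equivalent:
# JAVA_OPTS="${JAVA_OPTS:-} -Xlog:gc*:file=/var/log/app/gc.log:time,uptime"
```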

  2. OS thread CPU usage
  3. Java thread dumps
  4. Application logging (if the Java application is an application server):
  • JBoss EAP 4 / 5:
    • $JBOSS_HOME/server/$PROFILE/log/server.log
    • $JBOSS_HOME/server/$PROFILE/log/boot.log
  • JBoss EAP 6 standalone mode:
    • $JBOSS_HOME/standalone/log/server.log
  • JBoss EAP 6 domain mode:
    • $JBOSS_HOME/domain/log/host-controller.log
    • $JBOSS_HOME/domain/log/domain-controller.log
    • $JBOSS_HOME/domain/servers/$SERVER/log/server.log
  • Tomcat: catalina.out
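The thread CPU and thread dump data (items 2 and 3) can be captured with standard tools; a minimal sketch, assuming Linux `top` and a JDK with `jstack` on the PATH:

```shell
# JAVA_PID is hypothetical -- substitute the real Java process ID
# (e.g. from `pgrep -f java` or `jps`); $$ is only a stand-in here.
PID=${JAVA_PID:-$$}
TS=$(date +%s)

# Per-thread CPU snapshot; repeat several times during the high-CPU window.
top -b -H -n 1 -p "$PID" > "high-cpu-threads-$TS.txt" 2>/dev/null || true

# Matching thread dump; alternatively `kill -3 $PID` writes the dump to the
# JVM's standard output (e.g. the console log or catalina.out).
jstack "$PID" > "threaddump-$TS.txt" 2>/dev/null || true
```

Capture both at the same moment, several times, so the CPU-hungry native thread IDs can later be matched against the dumps.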

Analysis

  1. Garbage Collection

The most common cause of high CPU is excessive garbage collection (GC). Excessive garbage collection can also cause threading issues, so garbage collection issues must be addressed before doing further analysis. This should be straightforward: enabling standard garbage collection logging is a best practice, and good tooling is available for the analysis.

  • For general garbage collection analysis, see How do I analyze Java garbage collection logging?.

  • If you see long pause times and/or low throughput overall, see Java garbage collection long pause times.

  • To see garbage collection activity at the time of the issue, analyze the gc logging around the time of the first timestamp in the high CPU data gathered. If the garbage collection logging does not have a timestamp but just the number of seconds after JVM startup, use the first timestamp in the console log, boot.log (JBoss), server.log (JBoss), or catalina.out (Tomcat) and calculate the number of seconds to the first timestamp in the high CPU data gathered.
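The uptime calculation above can be sketched with two wall-clock timestamps; this assumes GNU `date`, and the timestamps are made-up examples:

```shell
# Hypothetical timestamps: the first entry in boot.log/catalina.out (JVM start)
# and the first timestamp in the high-CPU data gathered.
JVM_START="2023-05-01 10:00:00"
HIGH_CPU="2023-05-01 13:25:30"

# Offset in seconds since JVM startup -- the region of the gc log to inspect.
OFFSET=$(( $(date -d "$HIGH_CPU" +%s) - $(date -d "$JVM_START" +%s) ))
echo "$OFFSET"   # 12330 -> look around uptime ~12330s in the gc log
```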

  2. Thread(s) Consuming CPU

If garbage collection is not responsible for the high CPU, identify the thread(s) consuming CPU and locate them in the thread dumps:
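`top -H` and `ps` report thread IDs in decimal, while Java thread dumps record the same native ID in hexadecimal in the `nid=` field; converting one to the other makes the match. A small sketch (the thread ID is a made-up example):

```shell
# TID is hypothetical -- substitute a high-CPU thread ID from `top -H` output.
TID=12345
NID=$(printf '0x%x' "$TID")
echo "$NID"   # 0x3039 -> search the thread dump for "nid=0x3039"
```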

  3. Application Logging

The application logging may provide additional clues and is collected for completeness.

Check the application logging for OutOfMemory errors.
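A quick way to do that check is to grep the collected logs; the path below is a placeholder for whichever server.log or catalina.out applies:

```shell
# LOG is a placeholder path -- point it at the gathered application log.
LOG=${LOG:-server.log}
grep -n 'OutOfMemoryError' "$LOG" 2>/dev/null \
  || echo "no OutOfMemoryError entries found in $LOG"
```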

For analyzing JBoss server logs, consider the Access Labs app Log Reaper.


This solution is part of Red Hat’s fast-track publication program, providing a huge library of solutions that Red Hat engineers have created while supporting our customers. To give you the knowledge you need the instant it becomes available, these articles may be presented in a raw and unedited form.