Solution to high CPU usage of Tomcat process

The CPU often becomes the bottleneck of system performance. Typical causes include:

  • A memory leak triggers frequent GC, which in turn drives CPU usage up
  • A code bug creates a large number of threads, causing frequent CPU context switching

Saying that CPU usage is "too high" implies a baseline value to compare against, for example:

  • The JVM averages 40% CPU utilization at peak load
  • CPU usage reaching 80% is considered abnormal

A JVM process consists of multiple Java threads:

  • Some are idle, waiting for work
  • Others are busy executing tasks

The most important thing is to find out which threads are consuming the CPU and to locate the problem code through their thread stacks. If no individual thread shows particularly high CPU usage, consider whether thread context switching is the cause.

Case

The test program simulates high CPU usage by creating 4096 threads in a thread pool.

Start the program in Linux environment:

java -Xss256k -jar demo-0.0.1-SNAPSHOT.jar

The thread stack size is set to 256KB. For this test program the operating system default of 8192KB is too large: with 4096 threads it would reserve roughly 32GB of stack space, while 256KB per thread needs only about 1GB.

Using the top command, we see that the Java process is using 961.6% of the CPU and notice that the process ID is 55790.
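If an interactive top session is inconvenient, the same ranking can be obtained non-interactively with ps. A sketch, assuming a Linux system with procps (the column and sort names are procps-specific):

```shell
# List processes sorted by CPU usage, highest first; the offending
# Java process should appear at or near the top of the list.
ps -eo pid,pcpu,comm --sort=-pcpu | head -6
```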

Use the more refined top command to view the CPU usage of each thread in this Java process:

top -H -p 55790

It can be seen that a thread called "scheduling-1" occupies a large amount of CPU, reaching 42.5%. So the next step is to find out what this thread is doing.

To find out what the threads are doing, generate a thread snapshot using jstack.
The jstack output is large and is usually written to a file:

jstack 55790 > 55790.log
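Note that top -H prints thread IDs in decimal, while jstack labels each thread with a hexadecimal nid, so matching the two takes a quick conversion. A sketch, where 55831 is a hypothetical TID read off the top -H output:

```shell
# Convert a decimal thread ID from `top -H` into the hex nid used by jstack
TID=55831                    # hypothetical TID taken from top -H
NID=$(printf '0x%x' "$TID")
echo "$NID"                  # prints 0xda17
# Then search the dump for that thread: grep -A 20 "nid=$NID" 55790.log
```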

Open 55790.log and search for the thread named scheduling-1 identified in the previous step. Its thread stack contains a call to AbstractExecutorService#submit: this is a periodic task thread started by Spring Boot, and it consumes a lot of CPU submitting tasks to the thread pool.

Context switching overhead?

After going through the above process, it is usually possible to locate the threads that consume a lot of CPU and the buggy code, such as an infinite loop. But in this case the Java process occupies 961.6% of the CPU, while the "scheduling-1" thread only accounts for 42.5%. So who is using the rest?

In the thread list produced by the top -H -p pid command, there are many threads named "pool-1-thread-x". Their individual CPU usage is not high, but there are a lot of them. As you might have guessed, these are the worker threads of the thread pool. Is the remaining CPU consumed by these threads?

We also need to check the jstack output to see whether the threads in this pool are actually working or just "resting".

It is found that these "pool-1-thread-x" threads are basically in WAITING state.

  • BLOCKED means a thread is stalled trying to enter a critical section because it is waiting for a monitor lock (synchronized) or a Lock held by another thread. Note that a thread in this state has not yet obtained the lock.
  • WAITING means a thread is waiting for another thread to perform some action; it enters this state through Object.wait, Thread.join, or LockSupport.park. For Object.wait, the thread must already hold the object's monitor, and the JVM releases that monitor as the thread enters the wait. When Object.notify (or LockSupport.unpark) is called, the thread wakes up and, in the Object.wait case, must compete for and re-acquire the monitor before it returns to RUNNABLE and continues execution.

Back to our "pool-1-thread-x" threads: they are in the WAITING state. From the thread stack we can see that they are "waiting" inside the getTask method call. Each thread tries to take a task from the thread pool's queue, but the queue is empty, so it parks via LockSupport.park and enters the WAITING state. How many "pool-1-thread-x" threads are there? The following command counts them; the result is 4096, exactly the size of the thread pool.

grep -o 'pool-1-thread' 55790.log | wc -l
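Beyond counting one pool, it is often useful to get an overview of all thread states in the dump at once. A sketch over the same 55790.log file:

```shell
# Tally jstack threads by state, most common first; a dump dominated by
# WAITING/TIMED_WAITING lines means most threads are idle, not working.
grep 'java.lang.Thread.State' 55790.log | sort | uniq -c | sort -rn
```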

Who consumes the remaining CPU?
We should suspect the CPU context switching overhead because we see a large number of threads in the Java process.

Let's use the vmstat command to view the thread context switching activity at the operating system level:
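A typical invocation is `vmstat 1 5` (one sample per second, five samples). On Linux, vmstat's cs figure is derived from the kernel's ctxt counter in /proc/stat, which can also be read directly; a self-contained sketch:

```shell
# vmstat's "cs" column comes from the kernel's ctxt counter (Linux only).
# Read it twice, one second apart, to get context switches per second.
c1=$(awk '/^ctxt/ {print $2}' /proc/stat)
sleep 1
c2=$(awk '/^ctxt/ {print $2}' /proc/stat)
echo "context switches/sec: $((c2 - c1))"
```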

The cs column shows the number of context switches per second and in the number of CPU interrupts per second. Both numbers are very high, which supports our guess: thread context switching is consuming a lot of CPU.
Which process caused it?

Stop the Spring Boot program and run vmstat again. Both in and cs drop significantly, confirming that the context-switching overhead was caused by the Java process 55790.

Summary

When CPU usage is too high, first locate the process causing it, then use the top -H -p pid command to locate the specific threads.
Next, inspect the threads with jstack, checking both their number and their states. If the number of threads is very large, suspect thread context-switching overhead, and confirm it with the vmstat and pidstat tools.
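pidstat reports per-process context switching as rates (cswch/s for voluntary, nvcswch/s for involuntary switches); the raw counters behind those rates live in /proc/<pid>/status on Linux. A self-contained sketch that reads them for the current shell (substitute the suspect PID in practice):

```shell
# voluntary_ctxt_switches:    the task gave up the CPU (e.g. blocked waiting)
# nonvoluntary_ctxt_switches: the scheduler preempted the task
grep ctxt_switches "/proc/$$/status"
# For a suspect process, the rate view is: pidstat -w -p <pid> 1
```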


