JVM in practice: using the MAT tool to analyze OOM problems

Let's jump straight into analyzing and fighting an OOM.

Straight to the code:

import java.util.ArrayList;
import java.util.List;

public class Demo4 {

    public static void main(String[] args) {
        // Keep adding objects to a list that is never cleared,
        // so the heap fills up and an OOM is guaranteed.
        List<Dandan> list = new ArrayList<>();
        while (true) {
            list.add(new Dandan());
        }
    }
}

class Dandan {
}

JVM parameters:

-XX:+UseParNewGC -XX:+UseConcMarkSweepGC -Xms10m -Xmx10m -XX:+PrintGCDetails -Xloggc:gc_dandan.log -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=./
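
For reference, a complete launch command with these options might look like the sketch below. It assumes JDK 8 (ParNew and CMS were deprecated and later removed in newer JDKs) and uses the package name that appears in the stack trace further down.

java -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -Xms10m -Xmx10m \
     -XX:+PrintGCDetails -Xloggc:gc_dandan.log \
     -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=./ \
     com.hailintang.demo.jdk8.gc.oom.Demo4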

Log after launch:

java.lang.OutOfMemoryError: Java heap space
Dumping heap to ./java_pid22788.hprof ...
Heap dump file created [13244840 bytes in 0.050 secs]
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
	at java.util.Arrays.copyOf(Arrays.java:3210)
	at java.util.Arrays.copyOf(Arrays.java:3181)
	at java.util.ArrayList.grow(ArrayList.java:267)
	at java.util.ArrayList.ensureExplicitCapacity(ArrayList.java:241)
	at java.util.ArrayList.ensureCapacityInternal(ArrayList.java:233)
	at java.util.ArrayList.add(ArrayList.java:464)
	at com.hailintang.demo.jdk8.gc.oom.Demo4.main(Demo4.java:11)

Process finished with exit code 1

java.lang.OutOfMemoryError: Java heap space

First of all, this message tells us clearly where the problem is: the Java heap.

Because we configured the option -XX:+HeapDumpOnOutOfMemoryError,

a heap dump file was created automatically the moment the OOM occurred: java_pid22788.hprof

Naming format: java_pid<process id>.hprof
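
As an aside: even without that option, you can capture a heap dump from a running JVM manually. A sketch, assuming a JDK 8 toolchain where jps and jmap are on the PATH (the dump file name is just an example):

jps -l
jmap -dump:format=b,file=heap_dump.hprof <pid>

(jps lists running Java processes so you can find the pid; jmap writes a binary .hprof dump that MAT can open.)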

Then, of course, I used MAT to analyze the .hprof file.

What should we focus on before starting the analysis?

  • First, which objects take up the most memory.
  • Second, who references those objects. (We need to know why they cannot be released.)
  • Third, locate the specific line of code that caused the problem.

Open the .hprof file in MAT to start analyzing the memory leak.


Let's first look at what is eating the memory: click the Histogram button on the toolbar.


First: which objects take up the most memory

This opens the histogram view.


Once the histogram is open, you can see at a glance which class's objects occupy the most memory.

For example, the class com.hailintang.demo.jdk8.gc.oom.Dandan clearly consumes a large amount of memory.

There are 360,146 instances of this class. So far we can tentatively conclude that Dandan objects are taking up too much memory.
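
As a rough sanity check (assuming a 64-bit HotSpot JVM with compressed object pointers, where an empty object such as Dandan has a shallow size of about 16 bytes): 360,146 instances × 16 bytes ≈ 5.8 MB, plus the ArrayList's backing array of references, which is more than enough to exhaust a 10 MB heap.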

Second: who references these objects

Next, let's see who references these memory-hogging objects.

For this we need MAT's dominator tree, a view for analyzing which objects keep which other objects alive.


From it you can see which threads are holding on to too many objects.

For example, here the main thread retains a huge number of objects.


Expand the main thread entry to see which objects it is holding.

The main thread accounts for most of the memory.


Click into it and, sure enough,

we find that it is a java.lang.Object[] array.

Expand it,

and the array turns out to be full of Dandan objects.

At this point the truth is clear.

Expanding layer by layer, we see the chain: the main thread holds an Object[] array, and that array holds the Dandan objects that are taking up too much memory.
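
Why an Object[] rather than the ArrayList itself? ArrayList keeps its elements in an internal Object[] buffer, and it is that buffer which dominates the Dandan instances in the dominator tree. A heavily simplified sketch of the structure (not the real JDK source, just for illustration):

import java.util.Arrays;

// Roughly how ArrayList stores its elements: a plain Object[] buffer that is
// copied into a larger array (~1.5x) whenever it fills up. Arrays.copyOf is
// exactly the frame at the top of the OOM stack trace shown earlier.
class ArrayListSketch<E> {
    private Object[] elementData = new Object[10]; // the buffer MAT reports as java.lang.Object[]
    private int size;

    public void add(E e) {
        if (size == elementData.length) {
            elementData = Arrays.copyOf(elementData, elementData.length + (elementData.length >> 1));
        }
        elementData[size++] = e;
    }
}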

Third: locate the specific line of code causing the problem

Having found the reference chain, the last step is to find out exactly where in the code all these objects are being created.

For this we need another view: thread_overview.

The thread_overview view shows all of the JVM's threads, the call stack of each thread at the time of the dump, and the objects referenced by each stack frame.

A note on reading stacks: a stack is last-in, first-out, so read a call stack from the bottom up; the bottom frame is the entry point.
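
Applied to the OOM stack trace in the log above, reading bottom-up gives: Demo4.main (Demo4.java:11) calls ArrayList.add, which goes through ensureCapacityInternal, ensureExplicitCapacity and grow to Arrays.copyOf, where the allocation finally failed.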


Open the thread_overview view.


Find the main thread and click into it.


You can quickly see Demo4.java:11

At this point you may not be completely sure yet, so keep expanding and looking further.

You can see that there are actually a large number of Dandan objects created here.

At this point, compare this against the actual code:

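For reference, the statement the stack trace points at (Demo4.java:11) is the add call inside the endless loop:

        while (true) {
            list.add(new Dandan());   // Demo4.java:11 in the stack trace
        }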

And with that, we have found the specific code that caused the OOM problem.

A quick summary

Our methodology:

  • First, which objects take up the most memory. —— the histogram
  • Second, who references those objects. (We need to know why they cannot be released.) —— the dominator tree
  • Third, locate the specific line of code causing the problem. —— thread_overview

Then, following this methodology with the MAT tool, we work through the steps one by one. The goal is to find the exact line of code that caused the OOM.

The above is a relatively simple OOM problem. It is only a demonstration, meant to show you how to use MAT to analyze an OOM.

If the project is more complex, the third step may show that the problematic code lives in middleware such as Tomcat, Jetty, or an RPC framework rather than in your own code. In that case you will also need to be familiar with that middleware to track the problem down. If the problematic code is business code, the engineer responsible for that module needs to take over and locate it.

Either way, this MAT-based OOM analysis methodology can be applied directly in everyday work.

Well, that’s it for today’s technical exchange.

If you have any questions, please leave a message.
