
Useful JVM Flags – Part 1 (JVM Types and Compiler Modes)

8.3.2011 | 6 minutes of reading time

Modern JVMs do an amazing job of running Java applications (and those of other compatible languages) in an efficient and stable manner. Adaptive memory management, garbage collection, just-in-time compilation, dynamic classloading, lock optimization – to name just some of the magic that happens behind the scenes but rarely affects the average programmer directly. At run time, the JVM optimizes the way it handles the application, or parts of it, based on continuous measurements and profiling.

Even with such a level of automation (or rather, because of so much automation, as one might argue), it is important that the JVM still provides adequate facilities for external monitoring and manual tuning. In case of errors or poor performance, it must be possible for experts to intervene. Incidentally, aside from all the magic that happens under the hood, a wide range of manual tuning knobs is one of the strengths of modern JVMs as well. Of particular interest are the command line flags that can be passed to the JVM at startup. Some JVMs provide several hundred of these flags, and it is easy to get lost without proper knowledge in this area. The goal of this blog series is to highlight the most relevant flags for everyday use and to explain what they are good for. We will focus on the Sun/Oracle HotSpot JVM as of Java 6, though in most cases similar flags exist for the other popular JVMs.

-server and -client

There are two types of the HotSpot JVM, namely “server” and “client”. The server VM uses a larger default heap size, a parallel garbage collector, and optimizes code more aggressively at run time. The client VM is more conservative, resulting in shorter startup time and a lower memory footprint. Thanks to a concept called “JVM ergonomics”, the type of JVM is chosen automatically at startup based on certain criteria regarding the available hardware and operating system. The exact criteria are listed in the HotSpot ergonomics documentation, and from that criteria table we also see that the client VM is only available on 32-bit systems.

If we are not happy with the pre-selected JVM, we can use the flags -server and -client to prescribe the usage of the server and client VM, respectively. Even though the server VM was originally targeted at long-running server processes, nowadays it often shows better performance than the client VM in many standalone applications as well. My recommendation is to choose the server VM by setting the -server flag whenever performance in the sense of shorter execution time is important for an application. A common gotcha: on 32-bit systems, a HotSpot JDK is required to run the server VM at all – the 32-bit JRE only ships with the client VM.
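For example, to request one VM or the other explicitly when launching an application (the jar name here is just a placeholder):

$ java -server -jar myapp.jar   # placeholder jar name
$ java -client -jar myapp.jar   # placeholder jar name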

-version and -showversion

How do we know which Java installation and which JVM type is used when we call java? With more than one Java installation on a system, there is always a slight risk of running the wrong JVM without noticing it. The pre-installed JVMs on various Linux distributions are especially notorious in this respect, even though I have to admit that things have become better over the years.

Luckily, we have the -version flag available, which prints some information about the JVM in use to stdout. One example:

$ java -version
java version "1.6.0_24"
Java(TM) SE Runtime Environment (build 1.6.0_24-b07)
Java HotSpot(TM) Client VM (build 19.1-b02, mixed mode, sharing)

The output shows the Java version number (1.6.0_24) and the exact build ID of the JRE used (1.6.0_24-b07). We also see the name (HotSpot), the type (Client), and the build ID (19.1-b02) of the JVM. In addition, we learn that the JVM runs in mixed mode. This mode of execution is the default mode of HotSpot and means that the JVM dynamically compiles bytecode into native code at run time. We also learn that class data sharing is enabled. Class data sharing is an approach that stores the system classes of the JRE in a read-only cache (a .jsa file, “Java Shared Archive”) which is used as a shared resource by the classloaders of all Java processes. Class data sharing may be beneficial to performance compared to reading all of the class data from the JAR archives over and over again.
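As a quick aside, class data sharing can be switched off with the -Xshare:off flag, in which case the “sharing” indicator disappears from the version output:

$ java -Xshare:off -version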

The -version flag terminates the JVM immediately after printing out the above data. However, there is a similar flag, -showversion, which produces the same output but then proceeds to execute the given Java application. Thus, -showversion is a useful addition to the command line of virtually every Java application. You never know when you will suddenly need some information about the JVM used by a particular (crashed) Java application. By adding -showversion on startup, we are guaranteed to have this information available whenever we need it.
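For instance, the flag can simply be prepended to an ordinary application start (again, the jar name is just a placeholder):

$ java -showversion -jar myapp.jar   # placeholder jar name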

-Xint, -Xcomp, and -Xmixed

The two flags -Xint and -Xcomp are not too relevant for our everyday work, but they are highly interesting for learning something about the JVM. The -Xint flag forces the JVM to execute all bytecode in interpreted mode, which comes with a considerable slowdown, usually a factor of 10 or more. The flag -Xcomp forces exactly the opposite behavior, that is, the JVM compiles all bytecode into native code on first use, thereby applying the maximum optimization level. This sounds nice, because it completely avoids the slow interpreter. However, many applications will also suffer at least a bit from the use of -Xcomp, even if the drop in performance is not comparable to the one resulting from -Xint. The reason is that by setting -Xcomp we prevent the JVM from making full use of its JIT compiler. The JIT compiler creates method usage profiles at run time and then optimizes single methods (or parts of them) step by step, and sometimes speculatively, for the actual application behavior. Some of these optimization techniques, e.g., optimistic branch prediction, cannot be applied effectively without first profiling the application. Another aspect is that methods only get compiled at all once they prove themselves relevant, i.e., constitute some kind of hot spot in the application. Methods that are called rarely (or even only once) continue to be executed in interpreted mode, thus saving the compilation and optimization cost.

Note that mixed mode also has its own flag, -Xmixed. With recent versions of HotSpot, mixed mode is the default, so we don’t have to specify this flag anymore.

Let us consider the results of a simple example benchmark which fills a HashMap with objects and then retrieves them again. For each benchmark, the execution time shown is the average time over a large number of runs.
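A minimal sketch of such a benchmark might look as follows; the map size and the number of runs used here are assumptions for illustration, not the settings behind the numbers shown below:

import java.util.HashMap;
import java.util.Map;

// Minimal benchmark sketch: fill a HashMap, read every entry back,
// and report the average wall-clock time over a number of runs.
// The constants are illustrative assumptions, not the original settings.
public class Benchmark {

    private static final int RUNS = 100;        // assumed number of measured runs
    private static final int ENTRIES = 1000000; // assumed number of map entries

    public static void main(String[] args) {
        long totalNanos = 0;
        for (int run = 0; run < RUNS; run++) {
            long start = System.nanoTime();
            Map<Integer, String> map = new HashMap<Integer, String>();
            for (int i = 0; i < ENTRIES; i++) {
                map.put(i, "value" + i);
            }
            for (int i = 0; i < ENTRIES; i++) {
                if (map.get(i) == null) {
                    throw new IllegalStateException("missing entry " + i);
                }
            }
            totalNanos += System.nanoTime() - start;
        }
        double averageSeconds = totalNanos / (double) RUNS / 1e9;
        System.out.printf("Average time: %f seconds%n", averageSeconds);
    }
}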

$ java -server -showversion Benchmark
java version "1.6.0_24"
Java(TM) SE Runtime Environment (build 1.6.0_24-b07)
Java HotSpot(TM) Server VM (build 19.1-b02, mixed mode)

Average time: 0.856449 seconds

$ java -server -showversion -Xcomp Benchmark
java version "1.6.0_24"
Java(TM) SE Runtime Environment (build 1.6.0_24-b07)
Java HotSpot(TM) Server VM (build 19.1-b02, compiled mode)

Average time: 0.950892 seconds

$ java -server -showversion -Xint Benchmark
java version "1.6.0_24"
Java(TM) SE Runtime Environment (build 1.6.0_24-b07)
Java HotSpot(TM) Server VM (build 19.1-b02, interpreted mode)

Average time: 7.622285 seconds

Of course, there are also benchmarks that show -Xcomp to be best. Still, and especially for long-running applications, I would strongly advise everyone to stick to the JVM default settings and let the JIT compiler make full use of its dynamic potential. After all, the JIT compiler is one of the most sophisticated components of the JVM – in fact, the recent advances in this area are the major reason why Java is no longer considered slow.
