% Testing the JDK

## Using "make test" (the run-test framework)

This new way of running tests is developer-centric. It assumes that you have
built a JDK locally and want to test it. Running common test targets is simple,
and more complex ad-hoc combinations of tests are possible. The user interface
is forgiving, and clearly reports errors it cannot resolve.

The main target `test` uses the jdk-image as the tested product. There is
also an alternate target `exploded-test` that uses the exploded image
instead. Not all tests will run successfully on the exploded image, but using
this target can greatly improve rebuild times for certain workflows.

Previously, `make test` was used to invoke an old system for running tests, and
`make run-test` was used for the new test framework. For backward compatibility
with scripts and muscle memory, `run-test` (and variants like
`exploded-run-test` or `run-test-tier1`) are kept as aliases.

Some example command-lines:

    $ make test-tier1
    $ make test-jdk_lang JTREG="JOBS=8"
    $ make test TEST=jdk_lang
    $ make test-only TEST="gtest:LogTagSet gtest:LogTagSetDescriptions" GTEST="REPEAT=-1"
    $ make test TEST="hotspot:hotspot_gc" JTREG="JOBS=1;TIMEOUT_FACTOR=8;VM_OPTIONS=-XshowSettings -Xlog:gc+ref=debug"
    $ make test TEST="jtreg:test/hotspot:hotspot_gc test/hotspot/jtreg/native_sanity/JniVersion.java"
    $ make test TEST="micro:java.lang.reflect" MICRO="FORK=1;WARMUP_ITER=2"
    $ make exploded-test TEST=tier2

### Configuration

To be able to run JTReg tests, `configure` needs to know where to find the
JTReg test framework. If it is not picked up automatically by configure, use
the `--with-jtreg=<path to jtreg home>` option to point to the JTReg framework.
Note that this option should point to the JTReg home, i.e. the top directory,
containing `lib/jtreg.jar` etc. (An alternative is to set the `JT_HOME`
environment variable to point to the JTReg home before running `configure`.)
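
For example, assuming JTReg has been unpacked to some local directory (the path
below is just a placeholder), either of these should work:

    $ bash configure --with-jtreg=/path/to/jtreg
    $ JT_HOME=/path/to/jtreg bash configure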

To be able to run microbenchmarks, `configure` needs to know where to find
the JMH dependency. Use `--with-jmh=<path to JMH jars>` to point to a directory
containing the core JMH and transitive dependencies. The recommended dependencies
can be retrieved by running `sh make/devkit/createJMHBundle.sh`, after which
`--with-jmh=build/jmh/jars` should work.
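
For example, a typical sequence, run from the top of the JDK source tree, might
look like this:

    $ sh make/devkit/createJMHBundle.sh
    $ bash configure --with-jmh=build/jmh/jars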

## Test selection

All functionality is available using the `test` make target. In this use case,
the test or tests to be executed are controlled using the `TEST` variable. To
speed up subsequent test runs with no source code changes, `test-only` can be
used instead, which does not depend on the source and test image build.
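
For example, a first run can use `test`, and repeated runs of the same selection
can then use `test-only` to skip the source and test image dependencies:

    $ make test TEST=jdk_lang
    $ make test-only TEST=jdk_lang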

For some common top-level tests, direct make targets have been generated. This
includes all JTReg test groups, the hotspot gtest, and custom tests (if
present). This means that `make test-tier1` is equivalent to `make test
TEST="tier1"`, but the latter is more tab-completion friendly. For more complex
test runs, the `test TEST="x"` solution needs to be used.

The test specifications given in `TEST` are parsed into fully qualified test
descriptors, which clearly and unambiguously show which tests will be run. As an
example, `:tier1` will expand to `jtreg:$(TOPDIR)/test/hotspot/jtreg:tier1
jtreg:$(TOPDIR)/test/jdk:tier1 jtreg:$(TOPDIR)/test/langtools:tier1
jtreg:$(TOPDIR)/test/nashorn:tier1 jtreg:$(TOPDIR)/test/jaxp:tier1`. You can
always submit a list of fully qualified test descriptors in the `TEST` variable
if you want to shortcut the parser.
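
For example, the first command below lets the parser expand the group across all
test roots, while the second spells out the `jtreg:` prefix and a single test
root explicitly:

    $ make test TEST="tier1"
    $ make test TEST="jtreg:test/jdk:tier1"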

### JTReg

JTReg tests can be selected either by picking a JTReg test group, or a selection
of files or directories containing JTReg tests.

JTReg test groups can be specified either without a test root, e.g. `:tier1`
(or `tier1`, the initial colon is optional), or with, e.g. `hotspot:tier1`,
`test/jdk:jdk_util` or `$(TOPDIR)/test/hotspot/jtreg:hotspot_all`. The test
root can be specified either as an absolute path, or a path relative to the
JDK top directory, or the `test` directory. For simplicity, the hotspot
JTReg test root, which really is `hotspot/jtreg`, can be abbreviated as
just `hotspot`.

When specified without a test root, all matching groups from all test roots
will be added. Otherwise, only the group from the specified test root will be
added.
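
For example, these are all valid group selections; the first will match `tier1`
groups in every test root, while the other two are limited to a single root:

    $ make test TEST=":tier1"
    $ make test TEST="hotspot:tier1"
    $ make test TEST="test/jdk:jdk_util"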

Individual JTReg tests or directories containing JTReg tests can also be
specified, like `test/hotspot/jtreg/native_sanity/JniVersion.java` or
`hotspot/jtreg/native_sanity`. Just like for test root selection, you can
either specify an absolute path (which can even point to JTReg tests outside
the source tree), or a path relative to either the JDK top directory or the
`test` directory. `hotspot` can be used as an alias for `hotspot/jtreg` here as
well.

As long as the test groups or test paths can be uniquely resolved, you do not
need to enter the `jtreg:` prefix. If this is not possible, or if you want to
use a fully qualified test descriptor, add `jtreg:`, e.g.
`jtreg:test/hotspot/jtreg/native_sanity`.
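
For example, the following command lines should all select the same directory of
tests, the first using the `hotspot` alias and the last using a fully qualified
descriptor:

    $ make test TEST="hotspot/native_sanity"
    $ make test TEST="test/hotspot/jtreg/native_sanity"
    $ make test TEST="jtreg:test/hotspot/jtreg/native_sanity"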

### Gtest

Since the Hotspot Gtest suite is so quick, the default is to run all tests.
This is specified by just `gtest`, or as a fully qualified test descriptor
`gtest:all`.

If you want, you can single out an individual test or a group of tests, for
instance `gtest:LogDecorations` or `gtest:LogDecorations.level_test_vm`. This
can be particularly useful if you want to run a shaky test repeatedly.

For Gtest, there is a separate test suite for each JVM variant. The JVM variant
is defined by adding `/<variant>` to the test descriptor, e.g.
`gtest:Log/client`. If you specify no variant, gtest will run once for each JVM
variant present (e.g. server, client). So if you only have the server JVM
present, then `gtest:all` will be equivalent to `gtest:all/server`.
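
For example, the following commands run the full Gtest suite, a single test
group, and a group restricted to the client JVM variant, respectively (the last
one assumes a client JVM is present in your build):

    $ make test TEST="gtest:all"
    $ make test TEST="gtest:LogDecorations"
    $ make test TEST="gtest:Log/client"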

### Microbenchmarks

Which microbenchmarks to run is selected using a regular expression
following the `micro:` test descriptor, e.g., `micro:java.lang.reflect`. This
delegates the test selection to JMH, meaning package name, class name and even
benchmark method names can be used to select tests.

Using special characters like `|` in the regular expression is possible, but
they need to be escaped multiple times: `micro:ArrayCopy\\\\\|reflect`.
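
For example, to run all benchmarks whose names match `java.lang.reflect`:

    $ make test TEST="micro:java.lang.reflect"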

### Special tests

A handful of odd tests that are not covered by any other testing framework are
accessible using the `special:` test descriptor. Currently, this includes
`failure-handler` and `make`.

  * Failure handler testing is run using `special:failure-handler` or just
    `failure-handler` as test descriptor.

  * Tests for the build system, including both makefiles and related
    functionality, are run using `special:make` or just `make` as test
    descriptor. This is equivalent to `special:make:all`.

    A specific make test can be run by supplying it as argument, e.g.
    `special:make:idea`. As a special syntax, this can also be expressed as
    `make-idea`, which allows for command lines such as `make test-make-idea`
    (see the examples below).
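
For example, the following command lines run the failure handler tests, all make
tests, and a single make test, respectively:

    $ make test TEST="failure-handler"
    $ make test TEST="special:make:idea"
    $ make test-make-idea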

## Test results and summary

At the end of the test run, a summary of all tests run will be presented. This
will have a consistent look, regardless of what test suites were used. This is
a sample summary:

    ==============================
    Test summary
    ==============================
       TEST                                          TOTAL  PASS  FAIL ERROR
    >> jtreg:jdk/test:tier1                           1867  1865     2     0 <<
       jtreg:langtools/test:tier1                     4711  4711     0     0
       jtreg:nashorn/test:tier1                        133   133     0     0
    ==============================
    TEST FAILURE

Tests where the number of TOTAL tests does not equal the number of PASSed tests
will be considered a test failure. These are marked with the `>> ... <<` marker
for easy identification.

The classification of non-passed tests differs a bit between test suites. In
the summary, ERROR is used as a catch-all for tests that neither passed nor are
classified as failed by the framework. This might indicate a test framework
error, a timeout or other problems.

In case of test failures, `make test` will exit with a non-zero exit value.

All tests have their result stored in `build/$BUILD/test-results/$TEST_ID`,
where TEST_ID is a path-safe conversion from the fully qualified test
descriptor, e.g. for `jtreg:jdk/test:tier1` the TEST_ID is
`jtreg_jdk_test_tier1`. This path is also printed in the log at the end of the
test run.

Additional work data is stored in `build/$BUILD/test-support/$TEST_ID`. For
some frameworks, this directory might contain information that is useful in
determining the cause of a failed test.
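
For example, after a `jtreg:jdk/test:tier1` run, the result and support
directories could be inspected like this (with `$BUILD` standing in for your
particular build output directory):

    $ ls build/$BUILD/test-results/jtreg_jdk_test_tier1
    $ ls build/$BUILD/test-support/jtreg_jdk_test_tier1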

## Test suite control

It is possible to control various aspects of the test suites using make control
variables.

These variables use a keyword=value approach to allow multiple values to be
set. So, for instance, `JTREG="JOBS=1;TIMEOUT_FACTOR=8"` will set the JTReg
concurrency level to 1 and the timeout factor to 8. This is equivalent to
setting `JTREG_JOBS=1 JTREG_TIMEOUT_FACTOR=8`, but using the keyword format
means that the `JTREG` variable is parsed and verified for correctness, so
`JTREG="TMIEOUT_FACTOR=8"` would give an error, while `JTREG_TMIEOUT_FACTOR=8`
would just pass unnoticed.

To separate multiple keyword=value pairs, use `;` (semicolon). Since the shell
normally eats `;`, the recommended usage is to write the assignment inside
quotes, e.g. `JTREG="...;..."`. This will also make sure spaces are preserved,
as in `JTREG="VM_OPTIONS=-XshowSettings -Xlog:gc+ref=debug"`.

(Other ways are possible, e.g. using backslash: `JTREG=JOBS=1\;TIMEOUT_FACTOR=8`.
Also, as a special technique, the string `%20` will be replaced with space for
certain options, e.g. `JTREG=VM_OPTIONS=-XshowSettings%20-Xlog:gc+ref=debug`.
This can be useful if you have layers of scripts and have trouble getting
proper quoting of command line arguments through.)
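
For example, the following command lines should all pass the same two keyword
values to JTReg:

    $ make test TEST=tier1 JTREG="JOBS=1;TIMEOUT_FACTOR=8"
    $ make test TEST=tier1 JTREG=JOBS=1\;TIMEOUT_FACTOR=8
    $ make test TEST=tier1 JTREG_JOBS=1 JTREG_TIMEOUT_FACTOR=8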

As far as possible, the names of the keywords have been standardized between
test suites.

### General keywords (TEST_OPTS)

Some keywords are valid across different test suites. If you want to run tests
from multiple test suites, or just don't want to keep track of which test suite
specific control variable to use, you can use the general TEST_OPTS control
variable.

There are also some keywords that apply globally to the test runner system,
not to any specific test suites. These are also available as TEST_OPTS keywords.
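
For example, a sketch of a single invocation that passes the same VM options to
every selected test suite (the test selection here is just an illustration):

    $ make test TEST="jdk_lang gtest:all" TEST_OPTS="VM_OPTIONS=-XshowSettings"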

#### JOBS

Currently only applies to JTReg.

#### TIMEOUT_FACTOR

Currently only applies to JTReg.

#### VM_OPTIONS

Applies to JTReg, GTest and Micro.

#### JAVA_OPTIONS

Applies to JTReg, GTest and Micro.

#### AOT_MODULES

Applies to JTReg and GTest.

#### JCOV

This keyword applies globally to the test runner system. If set to `true`, it
enables JCov coverage reporting for all tests run. To be useful, the tests must
be run with a JDK built with JCov instrumentation (`configure
--with-jcov=<path to directory containing lib/jcov.jar>`, `make jcov-image`).

The simplest way to run tests with JCov coverage report is to use the special
target `jcov-test` instead of `test`, e.g. `make jcov-test TEST=jdk_lang`. This
will make sure the JCov image is built, and that JCov reporting is enabled.

The JCov report is stored in `build/$BUILD/test-results/jcov-output`.

Please note that running with JCov reporting can be very memory intensive.

### JTReg keywords

#### JOBS
The test concurrency (`-concurrency`).

Defaults to TEST_JOBS (if set by `--with-test-jobs=`), otherwise it defaults to
JOBS, except for Hotspot, where the default is *number of CPU cores/2* (for
sparc, if more than 16 cpus, then *number of CPU cores/5*, otherwise *number of
CPU cores/4*), but never more than *memory size in GB/2*.

#### TIMEOUT_FACTOR
The timeout factor (`-timeoutFactor`).

Defaults to 4.

#### TEST_MODE
The test mode (`-agentvm`, `-samevm` or `-othervm`).

Defaults to `-agentvm`.

#### ASSERT
Enable asserts (`-ea -esa`, or none).

Set to `true` or `false`. If true, adds `-ea -esa`. Defaults to true, except
for hotspot.

#### VERBOSE
The verbosity level (`-verbose`).

Defaults to `fail,error,summary`.

#### RETAIN
What test data to retain (`-retain`).

Defaults to `fail,error`.

#### MAX_MEM
Limit memory consumption (`-Xmx` and `-vmoption:-Xmx`, or none).

Limit memory consumption for JTReg test framework and VM under test. Set to 0
to disable the limits.

Defaults to 512m, except for hotspot, where it defaults to 0 (no limit).

#### KEYWORDS

JTReg keywords sent to JTReg using `-k`. Please be careful in making sure that
spaces and special characters (like `!`) are properly quoted. To avoid some
issues, the special value `%20` can be used instead of space.

#### EXTRA_PROBLEM_LISTS

Use additional problem lists file or files, in addition to the default
ProblemList.txt located at the JTReg test roots.

If multiple file names are specified, they should be separated by space (or, to
help avoid quoting issues, the special value `%20`).

The file names should be either absolute, or relative to the JTReg test root of
the tests to be run.

#### RUN_PROBLEM_LISTS

Use the problem lists to select tests instead of excluding them.

Set to `true` or `false`.
If `true`, JTReg will use the `-match:` option, otherwise `-exclude:` will be used.
Default is `false`.
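
For example, a sketch of running only the problem-listed tests in a test group:

    $ make test TEST=jdk_lang JTREG="RUN_PROBLEM_LISTS=true"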

#### OPTIONS
Additional options to the JTReg test framework.

Use `JTREG="OPTIONS=--help all"` to see all available JTReg options.

#### JAVA_OPTIONS
Additional Java options to JTReg (`-javaoption`).

#### VM_OPTIONS
Additional VM options to JTReg (`-vmoption`).

#### AOT_MODULES

Generate AOT modules before testing for the specified module, or set of
modules. If multiple modules are specified, they should be separated by space
(or, to help avoid quoting issues, the special value `%20`).

### Gtest keywords

#### REPEAT
The number of times to repeat the tests (`--gtest_repeat`).

Default is 1. Set to -1 to repeat indefinitely. This can be especially useful
combined with `OPTIONS=--gtest_break_on_failure` to reproduce an intermittent
problem.
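
For example, a sketch of chasing an intermittent Gtest failure by repeating a
single test group until it breaks:

    $ make test TEST="gtest:LogDecorations" GTEST="REPEAT=-1;OPTIONS=--gtest_break_on_failure"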

#### OPTIONS
Additional options to the Gtest test framework.

Use `GTEST="OPTIONS=--help"` to see all available Gtest options.

#### AOT_MODULES

Generate AOT modules before testing for the specified module, or set of
modules. If multiple modules are specified, they should be separated by space
(or, to help avoid quoting issues, the special value `%20`).

### Microbenchmark keywords

#### FORK
Override the number of benchmark forks to spawn. Same as specifying `-f <num>`.

#### ITER
Number of measurement iterations per fork. Same as specifying `-i <num>`.

#### TIME
Amount of time to spend in each measurement iteration, in seconds. Same as
specifying `-r <num>`.

#### WARMUP_ITER
Number of warmup iterations to run before the measurement phase in each fork.
Same as specifying `-wi <num>`.

#### WARMUP_TIME
Amount of time to spend in each warmup iteration. Same as specifying `-w <num>`.

#### RESULTS_FORMAT
Specify to have the test run save a log of the values. Accepts the same values
as `-rff`, i.e., `text`, `csv`, `scsv`, `json`, or `latex`.

#### VM_OPTIONS
Additional VM arguments to provide to forked off VMs. Same as `-jvmArgs <args>`.

#### OPTIONS
Additional arguments to send to JMH.
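
For example, a sketch combining several of these keywords in one run:

    $ make test TEST="micro:java.lang.reflect" MICRO="FORK=1;WARMUP_ITER=2;RESULTS_FORMAT=json"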

## Notes for Specific Tests

### Docker Tests

Docker tests with default parameters may fail on systems with glibc versions not
compatible with the one used in the default docker image (e.g., Oracle Linux 7.6
for x86). For example, they pass on Ubuntu 16.04 but fail on Ubuntu 18.04 if run
like this on x86:

    $ make test TEST="jtreg:test/hotspot/jtreg/containers/docker"

To run these tests correctly on Ubuntu 18.04, additional parameters selecting a
suitable docker image are required. These can be passed using `JAVA_OPTIONS`:

    $ make test TEST="jtreg:test/hotspot/jtreg/containers/docker" JTREG="JAVA_OPTIONS=-Djdk.test.docker.image.name=ubuntu -Djdk.test.docker.image.version=latest"

### Non-US locale

If your locale is non-US, some tests are likely to fail. To work around this,
you can set the locale to US. On Unix platforms simply setting `LANG="en_US"`
in the environment before running tests should work. On Windows, setting
`JTREG="VM_OPTIONS=-Duser.language=en -Duser.country=US"` helps for most, but
not all test cases. For example:

    $ export LANG="en_US" && make test TEST=...
    $ make test JTREG="VM_OPTIONS=-Duser.language=en -Duser.country=US" TEST=...

### PKCS11 Tests

It is highly recommended to use the latest NSS version when running PKCS11 tests.
An improper NSS version may lead to unexpected failures which are hard to diagnose.
For example, sun/security/pkcs11/Secmod/AddTrustedCert.java may fail on Ubuntu
18.04 with the default NSS version in the system.
To run these tests correctly, the system property `test.nss.lib.paths` is required
on Ubuntu 18.04 to specify the alternative NSS lib directories.
For example:

    $ make test TEST="jtreg:sun/security/pkcs11/Secmod/AddTrustedCert.java" JTREG="JAVA_OPTIONS=-Dtest.nss.lib.paths=/path/to/your/latest/NSS-libs"

For more notes about the PKCS11 tests, please refer to test/jdk/sun/security/pkcs11/README.

### Client UI Tests

Some Client UI tests use key sequences which may be reserved by the operating
system. Usually this causes the test to fail, so it is highly recommended to
disable system key shortcuts prior to testing. The steps to access and disable
system key shortcuts for various platforms are provided below.

#### MacOS
Choose Apple menu; System Preferences, click Keyboard, then click Shortcuts;
select or deselect desired shortcut.

For example, test/jdk/javax/swing/TooltipManager/JMenuItemToolTipKeyBindingsTest/JMenuItemToolTipKeyBindingsTest.java fails
on MacOS because it uses the `CTRL + F1` key sequence to show or hide a tooltip
message, but this key combination is reserved by the operating system. To run
the test correctly, the corresponding global key shortcut should be disabled
using the steps described above: deselect the "Turn keyboard access on or off"
option, which is responsible for the `CTRL + F1` combination.

#### Linux
Open the Activities overview and start typing Settings; Choose Settings, click
Devices, then click Keyboard; set or override desired shortcut.

#### Windows
Type `gpedit` in the Search and then click Edit group policy; navigate to
User Configuration -> Administrative Templates -> Windows Components -> File
Explorer; in the right-side pane look for "Turn off Windows key hotkeys" and
double click on it; enable or disable hotkeys.

Note: a restart is required to make the settings take effect.

---
# Override some definitions in the global css file that are not optimal for
# this document.
header-includes:
 - '<style type="text/css">pre, code, tt { color: #1d6ae5; }</style>'
---