% Testing the JDK

## Using "make test" (the run-test framework)

This new way of running tests is developer-centric. It assumes that you have
built a JDK locally and want to test it. Running common test targets is simple,
and more complex ad-hoc combinations of tests are possible. The user interface
is forgiving, and clearly reports errors it cannot resolve.

The main target `test` uses the jdk-image as the tested product. There is
also an alternate target `exploded-test` that uses the exploded image
instead. Not all tests will run successfully on the exploded image, but using
this target can greatly improve rebuild times for certain workflows.

Previously, `make test` was used to invoke an old system for running tests, and
`make run-test` was used for the new test framework. For backward compatibility
with scripts and muscle memory, `run-test` (and variants like
`exploded-run-test` or `run-test-tier1`) are kept as aliases.

Some example command-lines:

    $ make test-tier1
    $ make test-jdk_lang JTREG="JOBS=8"
    $ make test TEST=jdk_lang
    $ make test-only TEST="gtest:LogTagSet gtest:LogTagSetDescriptions" GTEST="REPEAT=-1"
    $ make test TEST="hotspot:hotspot_gc" JTREG="JOBS=1;TIMEOUT_FACTOR=8;JAVA_OPTIONS=-XshowSettings -Xlog:gc+ref=debug"
    $ make test TEST="jtreg:test/hotspot:hotspot_gc test/hotspot/jtreg/native_sanity/JniVersion.java"
    $ make test TEST="micro:java.lang.reflect" MICRO="FORK=1;WARMUP_ITER=2"
    $ make exploded-test TEST=tier2

### Configuration

To be able to run JTReg tests, `configure` needs to know where to find the
JTReg test framework. If it is not picked up automatically by configure, use
the `--with-jtreg=<path to jtreg home>` option to point to the JTReg framework.
Note that this option should point to the JTReg home, i.e. the top directory,
containing `lib/jtreg.jar` etc. (An alternative is to set the `JT_HOME`
environment variable to point to the JTReg home before running `configure`.)

To be able to run microbenchmarks, `configure` needs to know where to find the
JMH dependency. Use `--with-jmh=<path to JMH jars>` to point to a directory
containing the core JMH and transitive dependencies. The recommended
dependencies can be retrieved by running `sh make/devkit/createJMHBundle.sh`,
after which `--with-jmh=build/jmh/jars` should work.

## Test selection

All functionality is available using the `test` make target. In this use case,
the test or tests to be executed are controlled using the `TEST` variable. To
speed up subsequent test runs with no source code changes, `test-only` can be
used instead, which does not depend on the source and test image build.

For some common top-level tests, direct make targets have been generated. This
includes all JTReg test groups, the hotspot gtest, and custom tests (if
present). This means that `make test-tier1` is equivalent to `make test
TEST="tier1"`, but the former is more tab-completion friendly. For more complex
test runs, the `test TEST="x"` solution needs to be used.

The test specifications given in `TEST` are parsed into fully qualified test
descriptors, which clearly and unambiguously show which tests will be run. As
an example, `:tier1` will expand to `jtreg:$(TOPDIR)/test/hotspot/jtreg:tier1
jtreg:$(TOPDIR)/test/jdk:tier1 jtreg:$(TOPDIR)/test/langtools:tier1
jtreg:$(TOPDIR)/test/nashorn:tier1 jtreg:$(TOPDIR)/test/jaxp:tier1`. You can
always submit a list of fully qualified test descriptors in the `TEST` variable
if you want to shortcut the parser.
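The expansion above can be sketched in plain `sh`. This is purely illustrative: the real expansion is done by the make framework, and the list of test roots is simply the one from the example.

```shell
# Illustrative sketch only: fan a bare group name out over the test
# roots from the example above to form fully qualified descriptors.
group=tier1
for root in hotspot/jtreg jdk langtools nashorn jaxp; do
    printf 'jtreg:$(TOPDIR)/test/%s:%s\n' "$root" "$group"
done
```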

### Common Test Groups

Ideally, all tests are run for every change, but this may not be practical due
to the limited testing resources, the scope of the change, etc.

The source tree currently defines a few common test groups in the relevant
`TEST.groups` files. There are test groups that cover a specific component, for
example `hotspot_gc`. It is a good idea to look into `TEST.groups` files to get
a sense of what tests are relevant to a particular JDK component.

Component-specific tests may miss some unintended consequences of a change, so
other tests should also be run. Again, it might be impractical to run all
tests, and therefore _tiered_ test groups exist. Tiered test groups are not
component-specific, but rather cover the significant parts of the entire JDK.

Multiple tiers allow balancing test coverage and testing costs. Lower test
tiers are supposed to contain the simpler, quicker and more stable tests.
Higher tiers are supposed to contain progressively more thorough, slower, and
sometimes less stable tests, or the tests that require special configuration.

Contributors are expected to run the tests for the areas that are changed, and
the first N tiers they can afford to run, but at least tier1.

A brief description of the tiered test groups:

- `tier1`: This is the lowest test tier. Multiple developers run these tests
every day. Because of the widespread use, the tests in `tier1` are carefully
selected and optimized to run fast, and to run in the most stable manner. The
test failures in `tier1` are usually followed up on quickly, either with
fixes, or by adding relevant tests to the problem list. GitHub Actions
workflows, if enabled, run `tier1` tests.

- `tier2`: This test group covers even more ground. These contain, among other
things, tests that either run for too long to be at `tier1`, or may require
special configuration, or tests that are less stable, or cover the broader
range of non-core JVM and JDK features/components (for example, XML).

- `tier3`: This test group includes more stressful tests, the tests for corner
cases not covered by previous tiers, plus the tests that require GUIs. As
such, this suite should either be run with low concurrency (`TEST_JOBS=1`), or
without headful tests (`JTREG_KEYWORDS=\!headful`), or both.

- `tier4`: This test group includes every other test not covered by previous
tiers. It includes, for example, `vmTestbase` suites for Hotspot, which run
for many hours even on large machines. It also runs GUI tests, so the same
`TEST_JOBS` and `JTREG_KEYWORDS` caveats apply.

### JTReg

JTReg tests can be selected either by picking a JTReg test group, or a selection
of files or directories containing JTReg tests.

JTReg test groups can be specified either without a test root, e.g. `:tier1`
(or `tier1`, the initial colon is optional), or with, e.g. `hotspot:tier1`,
`test/jdk:jdk_util` or `$(TOPDIR)/test/hotspot/jtreg:hotspot_all`. The test
root can be specified either as an absolute path, or a path relative to the
JDK top directory, or the `test` directory. For simplicity, the hotspot
JTReg test root, which really is `hotspot/jtreg`, can be abbreviated as
just `hotspot`.

When specified without a test root, all matching groups from all test roots
will be added. Otherwise, only the group from the specified test root will be
added.

Individual JTReg tests or directories containing JTReg tests can also be
specified, like `test/hotspot/jtreg/native_sanity/JniVersion.java` or
`hotspot/jtreg/native_sanity`. Just like for test root selection, you can
either specify an absolute path (which can even point to JTReg tests outside
the source tree), or a path relative to either the JDK top directory or the
`test` directory. `hotspot` can be used as an alias for `hotspot/jtreg` here as
well.

As long as the test groups or test paths can be uniquely resolved, you do not
need to enter the `jtreg:` prefix. If this is not possible, or if you want to
use a fully qualified test descriptor, add `jtreg:`, e.g.
`jtreg:test/hotspot/jtreg/native_sanity`.

### Gtest

Since the Hotspot Gtest suite is so quick, the default is to run all tests.
This is specified by just `gtest`, or as a fully qualified test descriptor
`gtest:all`.

If you want, you can single out an individual test or a group of tests, for
instance `gtest:LogDecorations` or `gtest:LogDecorations.level_test_vm`. This
can be particularly useful if you want to run a shaky test repeatedly.

For Gtest, there is a separate test suite for each JVM variant. The JVM variant
is defined by adding `/<variant>` to the test descriptor, e.g.
`gtest:Log/client`. If you specify no variant, gtest will run once for each JVM
variant present (e.g. server, client). So if you only have the server JVM
present, then `gtest:all` will be equivalent to `gtest:all/server`.
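The shape of such a descriptor can be shown with a small shell sketch. This is illustrative only; the actual parsing happens inside the build system.

```shell
# Illustrative only: split a gtest descriptor like "gtest:Log/client"
# into its test filter and JVM variant parts.
desc="gtest:Log/client"
body=${desc#gtest:}        # strip the "gtest:" prefix -> "Log/client"
filter=${body%%/*}         # part before the slash   -> "Log"
variant=${body#*/}         # part after the slash    -> "client"
echo "$filter $variant"
```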

### Microbenchmarks

Which microbenchmarks to run is selected using a regular expression
following the `micro:` test descriptor, e.g., `micro:java.lang.reflect`. This
delegates the test selection to JMH, meaning package name, class name and even
benchmark method names can be used to select tests.

Using special characters like `|` in the regular expression is possible, but
needs to be escaped multiple times: `micro:ArrayCopy\\\\\|reflect`.

### Special tests

A handful of odd tests that are not covered by any other testing framework are
accessible using the `special:` test descriptor. Currently, this includes
`failure-handler` and `make`.

  * Failure handler testing is run using `special:failure-handler` or just
    `failure-handler` as test descriptor.

  * Tests for the build system, including both makefiles and related
    functionality, are run using `special:make` or just `make` as test
    descriptor. This is equivalent to `special:make:all`.

    A specific make test can be run by supplying it as argument, e.g.
    `special:make:idea`. As a special syntax, this can also be expressed as
    `make-idea`, which allows for command lines such as `make test-make-idea`.
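The `make-<test>` shorthand maps onto the fully qualified form in a straightforward way, which can be sketched as follows (illustrative only; the actual mapping is done by the make framework):

```shell
# Illustrative only: turn the "make-<test>" shorthand back into the
# fully qualified "special:make:<test>" descriptor.
short=make-idea
qualified="special:make:${short#make-}"
echo "$qualified"
```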

## Test results and summary

At the end of the test run, a summary of all tests run will be presented. This
will have a consistent look, regardless of what test suites were used. This is
a sample summary:

    ==============================
    Test summary
    ==============================
       TEST                                          TOTAL  PASS  FAIL ERROR
    >> jtreg:jdk/test:tier1                           1867  1865     2     0 <<
       jtreg:langtools/test:tier1                     4711  4711     0     0
       jtreg:nashorn/test:tier1                        133   133     0     0
    ==============================
    TEST FAILURE

Tests where the number of TOTAL tests does not equal the number of PASSed tests
will be considered a test failure. These are marked with the `>> ... <<` marker
for easy identification.

The classification of non-passed tests differs a bit between test suites. In
the summary, ERROR is used as a catch-all for tests that neither passed nor are
classified as failed by the framework. This might indicate a test framework
error, a timeout or other problems.

In case of test failures, `make test` will exit with a non-zero exit value.

All tests have their result stored in `build/$BUILD/test-results/$TEST_ID`,
where TEST_ID is a path-safe conversion from the fully qualified test
descriptor, e.g. for `jtreg:jdk/test:tier1` the TEST_ID is
`jtreg_jdk_test_tier1`. This path is also printed in the log at the end of the
test run.
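The path-safe conversion for this example can be mimicked with `tr`. This is a sketch of the idea only; the exact conversion rules are internal to the build system.

```shell
# Illustrative only: map the separator characters ':' and '/' to '_'
# to form a path-safe TEST_ID from a fully qualified test descriptor.
descriptor="jtreg:jdk/test:tier1"
test_id=$(printf '%s' "$descriptor" | tr ':/' '__')
echo "$test_id"
```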

Additional work data is stored in `build/$BUILD/test-support/$TEST_ID`. For
some frameworks, this directory might contain information that is useful in
determining the cause of a failed test.

## Test suite control

It is possible to control various aspects of the test suites using make control
variables.

These variables use a keyword=value approach to allow multiple values to be
set. So, for instance, `JTREG="JOBS=1;TIMEOUT_FACTOR=8"` will set the JTReg
concurrency level to 1 and the timeout factor to 8. This is equivalent to
setting `JTREG_JOBS=1 JTREG_TIMEOUT_FACTOR=8`, but using the keyword format
means that the `JTREG` variable is parsed and verified for correctness, so
`JTREG="TMIEOUT_FACTOR=8"` would give an error, while `JTREG_TMIEOUT_FACTOR=8`
would just pass unnoticed.
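The kind of verification described above can be sketched like this. This is a hypothetical illustration; the real validation is implemented in the build system's makefiles, and the keyword list here is abbreviated.

```shell
# Hypothetical sketch: reject keyword=value pairs whose keyword is not
# in a known list, which is how typos like TMIEOUT_FACTOR get caught.
known_keywords="JOBS TIMEOUT_FACTOR JAVA_OPTIONS VM_OPTIONS OPTIONS"
check_keyword() {
    kw=${1%%=*}
    case " $known_keywords " in
        *" $kw "*) echo "ok: $1" ;;
        *)         echo "error: unknown keyword '$kw'" ;;
    esac
}
check_keyword "TIMEOUT_FACTOR=8"    # accepted
check_keyword "TMIEOUT_FACTOR=8"    # rejected: typo
```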

To separate multiple keyword=value pairs, use `;` (semicolon). Since the shell
normally eats `;`, the recommended usage is to write the assignment inside
quotes, e.g. `JTREG="...;..."`. This will also make sure spaces are preserved,
as in `JTREG="JAVA_OPTIONS=-XshowSettings -Xlog:gc+ref=debug"`.

(Other ways are possible, e.g. using backslash: `JTREG=JOBS=1\;TIMEOUT_FACTOR=8`.
Also, as a special technique, the string `%20` will be replaced with space for
certain options, e.g. `JTREG=JAVA_OPTIONS=-XshowSettings%20-Xlog:gc+ref=debug`.
This can be useful if you have layers of scripts and have trouble getting
proper quoting of command line arguments through.)
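The `%20` substitution can be mimicked in plain `sh` (illustrative only; the replacement is actually performed by the test makefiles for the options that support this convention):

```shell
# Illustrative only: expand the %20 placeholder to a space, the way the
# test framework does for options that support this convention.
opts='-XshowSettings%20-Xlog:gc+ref=debug'
expanded=$(printf '%s' "$opts" | sed 's/%20/ /g')
echo "$expanded"
```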

As far as possible, the names of the keywords have been standardized between
test suites.

### General keywords (TEST_OPTS)

Some keywords are valid across different test suites. If you want to run tests
from multiple test suites, or just don't want to care about which test suite
specific control variable to use, then you can use the general TEST_OPTS
control variable.

There are also some keywords that apply globally to the test runner system,
not to any specific test suites. These are also available as TEST_OPTS keywords.

#### JOBS

Currently only applies to JTReg.

#### TIMEOUT_FACTOR

Currently only applies to JTReg.

#### JAVA_OPTIONS

Applies to JTReg, GTest and Micro.

#### VM_OPTIONS

Applies to JTReg, GTest and Micro.

#### AOT_MODULES

Applies to JTReg and GTest.

#### JCOV

This keyword applies globally to the test runner system. If set to `true`, it
enables JCov coverage reporting for all tests run. To be useful, the JDK under
test must be run with a JDK built with JCov instrumentation (`configure
--with-jcov=<path to directory containing lib/jcov.jar>`, `make jcov-image`).

The simplest way to run tests with JCov coverage report is to use the special
target `jcov-test` instead of `test`, e.g. `make jcov-test TEST=jdk_lang`. This
will make sure the JCov image is built, and that JCov reporting is enabled.

The JCov report is stored in `build/$BUILD/test-results/jcov-output/report`.

Please note that running with JCov reporting can be very memory intensive.

#### JCOV_DIFF_CHANGESET

While collecting code coverage with JCov, it is also possible to find coverage
for only recently changed code. JCOV_DIFF_CHANGESET specifies a source
revision. A textual report will be generated showing coverage of the diff
between the specified revision and the repository tip.

The report is stored in the
`build/$BUILD/test-results/jcov-output/diff_coverage_report` file.

### JTReg keywords

#### JOBS

The test concurrency (`-concurrency`).

Defaults to TEST_JOBS (if set by `--with-test-jobs=`), otherwise it defaults to
JOBS, except for Hotspot, where the default is *number of CPU cores/2*,
but never more than *memory size in GB/2*.

#### TIMEOUT_FACTOR

The timeout factor (`-timeoutFactor`).

Defaults to 4.

#### FAILURE_HANDLER_TIMEOUT

Sets the argument `-timeoutHandlerTimeout` for JTReg. The default value is 0.
This is only valid if the failure handler is built.

#### TEST_MODE

The test mode (`agentvm` or `othervm`).

Defaults to `agentvm`.

#### ASSERT

Enable asserts (`-ea -esa`, or none).

Set to `true` or `false`. If true, adds `-ea -esa`. Defaults to true, except
for hotspot.

#### VERBOSE

The verbosity level (`-verbose`).

Defaults to `fail,error,summary`.

#### RETAIN

What test data to retain (`-retain`).

Defaults to `fail,error`.

#### MAX_MEM

Limit memory consumption (`-Xmx` and `-vmoption:-Xmx`, or none).

Limit memory consumption for JTReg test framework and VM under test. Set to 0
to disable the limits.

Defaults to 512m, except for hotspot, where it defaults to 0 (no limit).

#### MAX_OUTPUT

Set the property `javatest.maxOutputSize` for the launcher, to change the
default JTReg log limit.

#### KEYWORDS

JTReg keywords sent to JTReg using `-k`. Please be careful to make sure that
spaces and special characters (like `!`) are properly quoted. To avoid some
issues, the special value `%20` can be used instead of space.

#### EXTRA_PROBLEM_LISTS

Use one or more additional problem list files, in addition to the default
ProblemList.txt located at the JTReg test roots.

If multiple file names are specified, they should be separated by space (or, to
help avoid quoting issues, the special value `%20`).

The file names should be either absolute, or relative to the JTReg test root of
the tests to be run.

#### RUN_PROBLEM_LISTS

Use the problem lists to select tests instead of excluding them.

Set to `true` or `false`.
If `true`, JTReg will use the `-match:` option, otherwise `-exclude:` will be
used. Default is `false`.

#### OPTIONS

Additional options to the JTReg test framework.

Use `JTREG="OPTIONS=--help all"` to see all available JTReg options.

#### JAVA_OPTIONS

Additional Java options for running test classes (sent to JTReg as
`-javaoption`).

#### VM_OPTIONS

Additional Java options to be used when compiling and running classes (sent to
JTReg as `-vmoption`).

This option is only needed in special circumstances. To pass Java options to
your test classes, use `JAVA_OPTIONS`.

#### LAUNCHER_OPTIONS

Additional Java options that are sent to the java launcher that starts the
JTReg harness.

#### AOT_MODULES

Generate AOT modules before testing for the specified module, or set of
modules. If multiple modules are specified, they should be separated by space
(or, to help avoid quoting issues, the special value `%20`).

#### RETRY_COUNT

Retry failed tests up to a set number of times. Defaults to 0.

### Gtest keywords

#### REPEAT

The number of times to repeat the tests (`--gtest_repeat`).

Default is 1. Set to -1 to repeat indefinitely. This can be especially useful
combined with `OPTIONS=--gtest_break_on_failure` to reproduce an intermittent
problem.

#### OPTIONS

Additional options to the Gtest test framework.

Use `GTEST="OPTIONS=--help"` to see all available Gtest options.

#### AOT_MODULES

Generate AOT modules before testing for the specified module, or set of
modules. If multiple modules are specified, they should be separated by space
(or, to help avoid quoting issues, the special value `%20`).

### Microbenchmark keywords

#### FORK

Override the number of benchmark forks to spawn. Same as specifying `-f <num>`.

#### ITER

Number of measurement iterations per fork. Same as specifying `-i <num>`.

#### TIME

Amount of time to spend in each measurement iteration, in seconds. Same as
specifying `-r <num>`.

#### WARMUP_ITER

Number of warmup iterations to run before the measurement phase in each fork.
Same as specifying `-wi <num>`.

#### WARMUP_TIME

Amount of time to spend in each warmup iteration. Same as specifying `-w <num>`.

#### RESULTS_FORMAT

Specify to have the test run save a log of the values. Accepts the same values
as `-rff`, i.e., `text`, `csv`, `scsv`, `json`, or `latex`.

#### VM_OPTIONS

Additional VM arguments to provide to forked off VMs. Same as `-jvmArgs <args>`.

#### OPTIONS

Additional arguments to send to JMH.

## Notes for Specific Tests

### Docker Tests

Docker tests with default parameters may fail on systems with glibc versions
not compatible with the one used in the default docker image (e.g., Oracle
Linux 7.6 for x86). For example, they pass on Ubuntu 16.04 but fail on Ubuntu
18.04 if run like this on x86:

```
$ make test TEST="jtreg:test/hotspot/jtreg/containers/docker"
```

To run these tests correctly on Ubuntu 18.04, additional parameters selecting
a compatible docker image must be supplied using `JAVA_OPTIONS`:

```
$ make test TEST="jtreg:test/hotspot/jtreg/containers/docker" \
    JTREG="JAVA_OPTIONS=-Djdk.test.docker.image.name=ubuntu
    -Djdk.test.docker.image.version=latest"
```

### Non-US locale

If your locale is non-US, some tests are likely to fail. To work around this
you can set the locale to US. On Unix platforms simply setting `LANG="en_US"`
in the environment before running tests should work. On Windows, setting
`JTREG="VM_OPTIONS=-Duser.language=en -Duser.country=US"` helps for most, but
not all test cases.

For example:

```
$ export LANG="en_US" && make test TEST=...
$ make test JTREG="VM_OPTIONS=-Duser.language=en -Duser.country=US" TEST=...
```

### PKCS11 Tests

It is highly recommended to use the latest NSS version when running PKCS11
tests. An improper NSS version may lead to unexpected failures which are hard
to diagnose. For example, sun/security/pkcs11/Secmod/AddTrustedCert.java may
fail on Ubuntu 18.04 with the default NSS version in the system. To run these
tests correctly on Ubuntu 18.04, the system property `test.nss.lib.paths` is
required to specify the alternative NSS lib directories.

For example:

```
$ make test TEST="jtreg:sun/security/pkcs11/Secmod/AddTrustedCert.java" \
    JTREG="JAVA_OPTIONS=-Dtest.nss.lib.paths=/path/to/your/latest/NSS-libs"
```

For more notes about the PKCS11 tests, please refer to
test/jdk/sun/security/pkcs11/README.

### Client UI Tests

Some Client UI tests use key sequences which may be reserved by the operating
system. Usually this causes the test to fail, so it is highly recommended to
disable system key shortcuts prior to testing. The steps to access and disable
system key shortcuts for various platforms are provided below.

#### MacOS

Choose Apple menu; System Preferences, click Keyboard, then click Shortcuts;
select or deselect the desired shortcut.

For example,
test/jdk/javax/swing/TooltipManager/JMenuItemToolTipKeyBindingsTest/JMenuItemToolTipKeyBindingsTest.java
fails on MacOS because it uses the `CTRL + F1` key sequence to show or hide a
tooltip message, but the key combination is reserved by the operating system.
To run the test correctly, disable the default global key shortcut using the
steps described above: deselect the "Turn keyboard access on or off" option,
which is responsible for the `CTRL + F1` combination.

#### Linux

Open the Activities overview and start typing Settings; choose Settings, click
Devices, then click Keyboard; set or override the desired shortcut.

#### Windows

Type `gpedit` in the Search box and then click Edit group policy; navigate to
User Configuration -> Administrative Templates -> Windows Components -> File
Explorer; in the right-side pane look for "Turn off Windows key hotkeys" and
double click on it; enable or disable hotkeys.

Note: a restart is required to make the settings take effect.

---
# Override some definitions in the global css file that are not optimal for
# this document.
header-includes:
 - '<style type="text/css">pre, code, tt { color: #1d6ae5; }</style>'
---