% Testing the JDK

## Using "make test" (the run-test framework)

This new way of running tests is developer-centric. It assumes that you have
built a JDK locally and want to test it. Running common test targets is simple,
and more complex ad-hoc combinations of tests are possible. The user interface
is forgiving, and clearly reports errors it cannot resolve.

The main target `test` uses the jdk-image as the tested product. There is
also an alternate target `exploded-test` that uses the exploded image
instead. Not all tests will run successfully on the exploded image, but using
this target can greatly improve rebuild times for certain workflows.

Previously, `make test` was used to invoke an old system for running tests, and
`make run-test` was used for the new test framework. For backward compatibility
with scripts and muscle memory, `run-test` (and variants like
`exploded-run-test` or `run-test-tier1`) are kept as aliases.

Some example command-lines:

    $ make test-tier1
    $ make test-jdk_lang JTREG="JOBS=8"
    $ make test TEST=jdk_lang
    $ make test-only TEST="gtest:LogTagSet gtest:LogTagSetDescriptions" GTEST="REPEAT=-1"
    $ make test TEST="hotspot:hotspot_gc" JTREG="JOBS=1;TIMEOUT_FACTOR=8;JAVA_OPTIONS=-XshowSettings -Xlog:gc+ref=debug"
    $ make test TEST="jtreg:test/hotspot:hotspot_gc test/hotspot/jtreg/native_sanity/JniVersion.java"
    $ make test TEST="micro:java.lang.reflect" MICRO="FORK=1;WARMUP_ITER=2"
    $ make exploded-test TEST=tier2

### Configuration

To be able to run JTReg tests, `configure` needs to know where to find the
JTReg test framework. If it is not picked up automatically by `configure`, use
the `--with-jtreg=<path to jtreg home>` option to point to the JTReg framework.
Note that this option should point to the JTReg home, i.e. the top directory,
containing `lib/jtreg.jar` etc.
(An alternative is to set the `JT_HOME`
environment variable to point to the JTReg home before running `configure`.)

To be able to run microbenchmarks, `configure` needs to know where to find the
JMH dependency. Use `--with-jmh=<path to JMH jars>` to point to a directory
containing the core JMH and transitive dependencies. The recommended
dependencies can be retrieved by running `sh make/devkit/createJMHBundle.sh`,
after which `--with-jmh=build/jmh/jars` should work.

## Test selection

All functionality is available using the `test` make target. In this use case,
the test or tests to be executed are controlled using the `TEST` variable. To
speed up subsequent test runs with no source code changes, `test-only` can be
used instead, which does not depend on the source and test image build.

For some common top-level tests, direct make targets have been generated. This
includes all JTReg test groups, the hotspot gtest, and custom tests (if
present). This means that `make test-tier1` is equivalent to `make test
TEST="tier1"`, but the latter is more tab-completion friendly. For more complex
test runs, the `test TEST="x"` solution needs to be used.

The test specifications given in `TEST` are parsed into fully qualified test
descriptors, which clearly and unambiguously show which tests will be run. As
an example, `:tier1` will expand to `jtreg:$(TOPDIR)/test/hotspot/jtreg:tier1
jtreg:$(TOPDIR)/test/jdk:tier1 jtreg:$(TOPDIR)/test/langtools:tier1
jtreg:$(TOPDIR)/test/nashorn:tier1 jtreg:$(TOPDIR)/test/jaxp:tier1`. You can
always submit a list of fully qualified test descriptors in the `TEST` variable
if you want to shortcut the parser.

### JTReg

JTReg tests can be selected either by picking a JTReg test group, or a
selection of files or directories containing JTReg tests.

JTReg test groups can be specified either without a test root, e.g. `:tier1`
(or `tier1`, the initial colon is optional), or with one, e.g. `hotspot:tier1`,
`test/jdk:jdk_util` or `$(TOPDIR)/test/hotspot/jtreg:hotspot_all`. The test
root can be specified either as an absolute path, or as a path relative to the
JDK top directory or the `test` directory. For simplicity, the hotspot
JTReg test root, which really is `hotspot/jtreg`, can be abbreviated as
just `hotspot`.

When specified without a test root, all matching groups from all test roots
will be added. Otherwise, only the group from the specified test root will be
added.

Individual JTReg tests or directories containing JTReg tests can also be
specified, like `test/hotspot/jtreg/native_sanity/JniVersion.java` or
`hotspot/jtreg/native_sanity`. Just like for test root selection, you can
either specify an absolute path (which can even point to JTReg tests outside
the source tree), or a path relative to either the JDK top directory or the
`test` directory. `hotspot` can be used as an alias for `hotspot/jtreg` here as
well.

As long as the test groups or test paths can be uniquely resolved, you do not
need to enter the `jtreg:` prefix. If this is not possible, or if you want to
use a fully qualified test descriptor, add `jtreg:`, e.g.
`jtreg:test/hotspot/jtreg/native_sanity`.

### Gtest

Since the Hotspot Gtest suite is so quick, the default is to run all tests.
This is specified by just `gtest`, or as a fully qualified test descriptor
`gtest:all`.

If you want, you can single out an individual test or a group of tests, for
instance `gtest:LogDecorations` or `gtest:LogDecorations.level_test_vm`. This
can be particularly useful if you want to run a shaky test repeatedly.

For Gtest, there is a separate test suite for each JVM variant. The JVM variant
is defined by adding `/<variant>` to the test descriptor, e.g.
`gtest:Log/client`.
If you specify no variant, gtest will run once for each JVM
variant present (e.g. server, client). So if you only have the server JVM
present, then `gtest:all` will be equivalent to `gtest:all/server`.

### Microbenchmarks

Which microbenchmarks to run is selected using a regular expression
following the `micro:` test descriptor, e.g., `micro:java.lang.reflect`. This
delegates the test selection to JMH, meaning package names, class names and
even benchmark method names can be used to select tests.

Using special characters like `|` in the regular expression is possible, but
they need to be escaped multiple times: `micro:ArrayCopy\\\\\|reflect`.

### Special tests

A handful of odd tests that are not covered by any other testing framework are
accessible using the `special:` test descriptor. Currently, this includes
`failure-handler` and `make`.

  * Failure handler testing is run using `special:failure-handler` or just
    `failure-handler` as test descriptor.

  * Tests for the build system, including both makefiles and related
    functionality, are run using `special:make` or just `make` as test
    descriptor. This is equivalent to `special:make:all`.

    A specific make test can be run by supplying it as argument, e.g.
    `special:make:idea`. As a special syntax, this can also be expressed as
    `make-idea`, which allows for command lines such as `make test-make-idea`.

## Test results and summary

At the end of the test run, a summary of all tests run will be presented. This
will have a consistent look, regardless of what test suites were used.
This is
a sample summary:

    ==============================
    Test summary
    ==============================
       TEST                          TOTAL  PASS  FAIL ERROR
    >> jtreg:jdk/test:tier1           1867  1865     2     0 <<
       jtreg:langtools/test:tier1     4711  4711     0     0
       jtreg:nashorn/test:tier1        133   133     0     0
    ==============================
    TEST FAILURE

Tests where the number of TOTAL tests does not equal the number of PASSed tests
will be considered a test failure. These are marked with the `>> ... <<` marker
for easy identification.

The classification of non-passed tests differs a bit between test suites. In
the summary, ERROR is used as a catch-all for tests that neither passed nor are
classified as failed by the framework. This might indicate a test framework
error, a timeout or other problems.

In case of test failures, `make test` will exit with a non-zero exit value.

All tests have their results stored in `build/$BUILD/test-results/$TEST_ID`,
where TEST_ID is a path-safe conversion of the fully qualified test
descriptor, e.g. for `jtreg:jdk/test:tier1` the TEST_ID is
`jtreg_jdk_test_tier1`. This path is also printed in the log at the end of the
test run.

Additional work data is stored in `build/$BUILD/test-support/$TEST_ID`. For
some frameworks, this directory might contain information that is useful in
determining the cause of a failed test.

## Test suite control

It is possible to control various aspects of the test suites using make control
variables.

These variables use a keyword=value approach to allow multiple values to be
set. So, for instance, `JTREG="JOBS=1;TIMEOUT_FACTOR=8"` will set the JTReg
concurrency level to 1 and the timeout factor to 8.
This is equivalent to
setting `JTREG_JOBS=1 JTREG_TIMEOUT_FACTOR=8`, but using the keyword format
means that the `JTREG` variable is parsed and verified for correctness, so
`JTREG="TMIEOUT_FACTOR=8"` would give an error, while `JTREG_TMIEOUT_FACTOR=8`
would just pass unnoticed.

To separate multiple keyword=value pairs, use `;` (semicolon). Since the shell
normally eats `;`, the recommended usage is to write the assignment inside
quotes, e.g. `JTREG="...;..."`. This will also make sure spaces are preserved,
as in `JTREG="JAVA_OPTIONS=-XshowSettings -Xlog:gc+ref=debug"`.

(Other ways are possible, e.g. using backslash: `JTREG=JOBS=1\;TIMEOUT_FACTOR=8`.
Also, as a special technique, the string `%20` will be replaced with space for
certain options, e.g. `JTREG=JAVA_OPTIONS=-XshowSettings%20-Xlog:gc+ref=debug`.
This can be useful if you have layers of scripts and have trouble getting
proper quoting of command line arguments through.)

As far as possible, the names of the keywords have been standardized between
test suites.

### General keywords (TEST_OPTS)

Some keywords are valid across different test suites. If you want to run tests
from multiple test suites, or just don't want to care about which test suite
specific control variable to use, then you can use the general TEST_OPTS
control variable.

There are also some keywords that apply globally to the test runner system,
not to any specific test suite. These are also available as TEST_OPTS keywords.

#### JOBS

Currently only applies to JTReg.

#### TIMEOUT_FACTOR

Currently only applies to JTReg.

#### JAVA_OPTIONS

Applies to JTReg, GTest and Micro.

#### VM_OPTIONS

Applies to JTReg, GTest and Micro.

#### AOT_MODULES

Applies to JTReg and GTest.

#### JCOV

This keyword applies globally to the test runner system.
If set to `true`, it
enables JCov coverage reporting for all tests run. To be useful, the JDK under
test must be built with JCov instrumentation (`configure
--with-jcov=<path to directory containing lib/jcov.jar>`, `make jcov-image`).

The simplest way to run tests with a JCov coverage report is to use the special
target `jcov-test` instead of `test`, e.g. `make jcov-test TEST=jdk_lang`. This
will make sure the JCov image is built, and that JCov reporting is enabled.

The JCov report is stored in `build/$BUILD/test-results/jcov-output/report`.

Please note that running with JCov reporting can be very memory intensive.

#### JCOV_DIFF_CHANGESET

While collecting code coverage with JCov, it is also possible to find coverage
for only recently changed code. JCOV_DIFF_CHANGESET specifies a source
revision. A textual report will be generated showing coverage of the diff
between the specified revision and the repository tip.

The report is stored in the
`build/$BUILD/test-results/jcov-output/diff_coverage_report` file.

### JTReg keywords

#### JOBS

The test concurrency (`-concurrency`).

Defaults to TEST_JOBS (if set by `--with-test-jobs=`), otherwise it defaults to
JOBS, except for Hotspot, where the default is *number of CPU cores/2*,
but never more than *memory size in GB/2*.

#### TIMEOUT_FACTOR

The timeout factor (`-timeoutFactor`).

Defaults to 4.

#### FAILURE_HANDLER_TIMEOUT

Sets the argument `-timeoutHandlerTimeout` for JTReg. The default value is 0.
This is only valid if the failure handler is built.

#### TEST_MODE

The test mode (`agentvm` or `othervm`).

Defaults to `agentvm`.

#### ASSERT

Enable asserts (`-ea -esa`, or none).

Set to `true` or `false`. If `true`, adds `-ea -esa`. Defaults to `true`,
except for hotspot.

#### VERBOSE

The verbosity level (`-verbose`).
Defaults to `fail,error,summary`.

#### RETAIN

What test data to retain (`-retain`).

Defaults to `fail,error`.

#### MAX_MEM

Limit memory consumption (`-Xmx` and `-vmoption:-Xmx`, or none).

Limits memory consumption for the JTReg test framework and the VM under test.
Set to 0 to disable the limits.

Defaults to 512m, except for hotspot, where it defaults to 0 (no limit).

#### MAX_OUTPUT

Set the property `javatest.maxOutputSize` for the launcher, to change the
default JTReg log limit.

#### KEYWORDS

JTReg keywords sent to JTReg using `-k`. Please be careful to make sure that
spaces and special characters (like `!`) are properly quoted. To avoid some
issues, the special value `%20` can be used instead of space.

#### EXTRA_PROBLEM_LISTS

Use one or more additional problem list files, in addition to the default
ProblemList.txt located at the JTReg test roots.

If multiple file names are specified, they should be separated by space (or, to
help avoid quoting issues, the special value `%20`).

The file names should be either absolute, or relative to the JTReg test root of
the tests to be run.

#### RUN_PROBLEM_LISTS

Use the problem lists to select tests instead of excluding them.

Set to `true` or `false`.
If `true`, JTReg will use the `-match:` option, otherwise `-exclude:` will be
used. Default is `false`.

#### OPTIONS

Additional options to the JTReg test framework.

Use `JTREG="OPTIONS=--help all"` to see all available JTReg options.

#### JAVA_OPTIONS

Additional Java options for running test classes (sent to JTReg as
`-javaoption`).

#### VM_OPTIONS

Additional Java options to be used when compiling and running classes (sent to
JTReg as `-vmoption`).

This option is only needed in special circumstances. To pass Java options to
your test classes, use `JAVA_OPTIONS`.
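As a rough illustration of the difference between these two keywords, here is a
sketch of how a `JTREG` control variable could be expanded into the jtreg flags
named above. This is a hypothetical sketch for illustration, not the actual
build system logic:

```shell
#!/bin/sh
# Hypothetical sketch (not the real makefile implementation): split the
# JTREG control variable on ';' and map each keyword to the jtreg flag
# documented in the sections above.
JTREG="JAVA_OPTIONS=-XshowSettings;VM_OPTIONS=-Xmx512m"
flags=""
old_ifs=$IFS; IFS=';'
for pair in $JTREG; do
  key=${pair%%=*}; val=${pair#*=}
  case $key in
    JAVA_OPTIONS) flags="$flags -javaoption $val" ;;  # running test classes
    VM_OPTIONS)   flags="$flags -vmoption $val" ;;    # compiling and running
  esac
done
IFS=$old_ifs
echo "$flags"
```

Running the sketch prints `-javaoption -XshowSettings -vmoption -Xmx512m`,
showing that both keywords carry Java options but target different parts of the
JTReg execution.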
#### LAUNCHER_OPTIONS

Additional Java options that are sent to the java launcher that starts the
JTReg harness.

#### AOT_MODULES

Generate AOT modules before testing for the specified module, or set of
modules. If multiple modules are specified, they should be separated by space
(or, to help avoid quoting issues, the special value `%20`).

#### RETRY_COUNT

Retry failed tests up to a set number of times. Defaults to 0.

### Gtest keywords

#### REPEAT

The number of times to repeat the tests (`--gtest_repeat`).

Default is 1. Set to -1 to repeat indefinitely. This can be especially useful
combined with `OPTIONS=--gtest_break_on_failure` to reproduce an intermittent
problem.

#### OPTIONS

Additional options to the Gtest test framework.

Use `GTEST="OPTIONS=--help"` to see all available Gtest options.

#### AOT_MODULES

Generate AOT modules before testing for the specified module, or set of
modules. If multiple modules are specified, they should be separated by space
(or, to help avoid quoting issues, the special value `%20`).

### Microbenchmark keywords

#### FORK

Override the number of benchmark forks to spawn. Same as specifying `-f <num>`.

#### ITER

Number of measurement iterations per fork. Same as specifying `-i <num>`.

#### TIME

Amount of time to spend in each measurement iteration, in seconds. Same as
specifying `-r <num>`.

#### WARMUP_ITER

Number of warmup iterations to run before the measurement phase in each fork.
Same as specifying `-wi <num>`.

#### WARMUP_TIME

Amount of time to spend in each warmup iteration. Same as specifying `-w <num>`.

#### RESULTS_FORMAT

Specify to have the test run save a log of the values. Accepts the same values
as `-rff`, i.e., `text`, `csv`, `scsv`, `json`, or `latex`.
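To tie the microbenchmark keywords together, the following sketch shows the JMH
arguments that a control variable like `MICRO="FORK=1;WARMUP_ITER=2;TIME=5"`
corresponds to. The mapping comes from the flag names in the sections above;
the actual translation in the build system makefiles may differ:

```shell
#!/bin/sh
# Hypothetical sketch: rewrite MICRO keyword=value pairs into the JMH
# flags documented above (-f, -wi, -r). Not the actual makefile logic.
MICRO="FORK=1;WARMUP_ITER=2;TIME=5"
args=$(echo "$MICRO" | tr ';' '\n' | sed \
  -e 's/^FORK=/-f /' \
  -e 's/^WARMUP_ITER=/-wi /' \
  -e 's/^TIME=/-r /' | tr '\n' ' ')
echo "$args"
```

This prints `-f 1 -wi 2 -r 5`, i.e. one fork, two warmup iterations, and
five-second measurement iterations.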
#### VM_OPTIONS

Additional VM arguments to provide to forked off VMs. Same as `-jvmArgs <args>`.

#### OPTIONS

Additional arguments to send to JMH.

## Notes for Specific Tests

### Docker Tests

Docker tests with default parameters may fail on systems with glibc versions
not compatible with the one used in the default docker image (e.g., Oracle
Linux 7.6 for x86). For example, they pass on Ubuntu 16.04 but fail on Ubuntu
18.04 if run like this on x86:

```
$ make test TEST="jtreg:test/hotspot/jtreg/containers/docker"
```

To run these tests correctly, additional parameters for the correct docker
image are required on Ubuntu 18.04, passed using `JAVA_OPTIONS`:

```
$ make test TEST="jtreg:test/hotspot/jtreg/containers/docker" \
    JTREG="JAVA_OPTIONS=-Djdk.test.docker.image.name=ubuntu
    -Djdk.test.docker.image.version=latest"
```

### Non-US locale

If your locale is non-US, some tests are likely to fail. To work around this
you can set the locale to US. On Unix platforms simply setting `LANG="en_US"`
in the environment before running tests should work. On Windows, setting
`JTREG="VM_OPTIONS=-Duser.language=en -Duser.country=US"` helps for most, but
not all test cases.

For example:

```
$ export LANG="en_US" && make test TEST=...
$ make test JTREG="VM_OPTIONS=-Duser.language=en -Duser.country=US" TEST=...
```

### PKCS11 Tests

It is highly recommended to use the latest NSS version when running PKCS11
tests. An improper NSS version may lead to unexpected failures which are hard
to diagnose. For example, sun/security/pkcs11/Secmod/AddTrustedCert.java may
fail on Ubuntu 18.04 with the default NSS version in the system. To run these
tests correctly, the system property `test.nss.lib.paths` is required on
Ubuntu 18.04 to specify the alternative NSS lib directories.
For example:

```
$ make test TEST="jtreg:sun/security/pkcs11/Secmod/AddTrustedCert.java" \
    JTREG="JAVA_OPTIONS=-Dtest.nss.lib.paths=/path/to/your/latest/NSS-libs"
```

For more notes about the PKCS11 tests, please refer to
test/jdk/sun/security/pkcs11/README.

### Client UI Tests

Some Client UI tests use key sequences which may be reserved by the operating
system. Usually that causes the test to fail. It is therefore highly
recommended to disable system key shortcuts prior to testing. The steps to
access and disable system key shortcuts for various platforms are provided
below.

#### MacOS

Choose Apple menu; System Preferences, click Keyboard, then click Shortcuts;
select or deselect the desired shortcut.

For example,
test/jdk/javax/swing/TooltipManager/JMenuItemToolTipKeyBindingsTest/JMenuItemToolTipKeyBindingsTest.java
fails on MacOS because it uses the `CTRL + F1` key sequence to show or hide a
tooltip message, but that key combination is reserved by the operating system.
To run the test correctly, the default global key shortcut should be disabled
using the steps described above: deselect the "Turn keyboard access on or off"
option, which is responsible for the `CTRL + F1` combination.

#### Linux

Open the Activities overview and start typing Settings; choose Settings, click
Devices, then click Keyboard; set or override the desired shortcut.

#### Windows

Type `gpedit` in the Search and then click Edit group policy; navigate to User
Configuration -> Administrative Templates -> Windows Components -> File
Explorer; in the right-side pane look for "Turn off Windows key hotkeys" and
double click on it; enable or disable hotkeys.

Note: a restart is required to make the settings take effect.

---
# Override some definitions in the global css file that are not optimal for
# this document.
header-includes:
 - '<style type="text/css">pre, code, tt { color: #1d6ae5; }</style>'
---