Copyright (C) Internet Systems Consortium, Inc. ("ISC")

SPDX-License-Identifier: MPL-2.0

This Source Code Form is subject to the terms of the Mozilla Public
License, v. 2.0.  If a copy of the MPL was not distributed with this
file, you can obtain one at https://mozilla.org/MPL/2.0/.

See the COPYRIGHT file distributed with this work for additional
information regarding copyright ownership.

Introduction
===
This directory holds a simple test environment for running BIND 9 system tests
involving multiple name servers.

With the exception of "common" (which holds configuration information common to
multiple tests) and "win32" (which holds files needed to run the tests in a
Windows environment), each directory holds a set of scripts and configuration
files to test different parts of BIND.  The directories are named for the
aspect of BIND they test, for example:

  dnssec/       DNSSEC tests
  forward/      Forwarding tests
  glue/         Glue handling tests

etc.

Typically each set of tests sets up 2-5 name servers and then performs one or
more tests against them.  Within the test subdirectory, each name server has a
separate subdirectory containing its configuration data.  These subdirectories
are named "nsN" or "ansN" (where N is a number between 1 and 8, e.g. ns1, ans2
etc.).

The tests are completely self-contained and do not require access to the real
DNS.  Generally, one of the test servers (usually ns1) is set up as a root
nameserver and is listed in the hints file of the others.


Preparing to Run the Tests
===
So that all servers can run on the same machine, they bind to separate virtual
IP addresses on the loopback interface.  ns1 runs on 10.53.0.1, ns2 on
10.53.0.2, etc.  Before running any tests, you must set up these addresses by
running the command

    sh ifconfig.sh up

as root.  The interfaces can be removed by executing the command:

    sh ifconfig.sh down

... also as root.

The servers use unprivileged ports (above 1024) instead of the usual port 53,
so they can be run without root privileges once the interfaces have been set
up.


Note for MacOS Users
---
If you wish to make the interfaces survive across reboots, copy
org.isc.bind.system and org.isc.bind.system.plist to /Library/LaunchDaemons,
then run

    launchctl load /Library/LaunchDaemons/org.isc.bind.system.plist

... as root.


Running the System Tests
===

Running an Individual Test
---
The tests can be run individually using the following command:

    sh run.sh [flags] <test-name> [<test-arguments>]

e.g.

    sh run.sh [flags] notify

Optional flags are:

    -k              Keep servers running after the test completes.  Each test
                    usually starts a number of nameservers, either instances
                    of the "named" being tested, or custom servers (written in
                    Python or Perl) that feature test-specific behavior.  The
                    servers are automatically started before the test is run
                    and stopped after it ends.  This flag leaves them running
                    at the end of the test, so that additional queries can be
                    sent by hand.  To stop the servers afterwards, use the
                    command "sh stop.sh <test-name>".

    -n              Noclean - do not remove the output files if the test
                    completes successfully.  By default, files created by the
                    test are deleted if it passes; they are not deleted if the
                    test fails.

    -p <number>     Sets the range of ports used by the test.  A block of 100
                    ports is available for each test, the number given to the
                    "-p" switch being the start of that block (e.g. "-p 7900"
                    will mean that the test is able to use ports 7900 through
                    7999).  If not specified, the test will have ports 5000 to
                    5099 available to it.

Arguments are:

    test-name       Mandatory.  The name of the test, which is the name of the
                    subdirectory in bin/tests/system holding the test files.

    test-arguments  Optional arguments that are passed to each of the test's
                    scripts.

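For example, the following invocation (one possible combination of the flags
above, using a test named in the Introduction) runs a single test on a
non-default port range and keeps both the servers and the output files for
later inspection; the second command shuts the servers down once they are no
longer needed:

    sh run.sh -p 12300 -k -n dnssec
    sh stop.sh dnssec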

Running All The System Tests
---
To run all the system tests, enter the command:

    sh runall.sh [-c] [-n] [numproc]

The optional flag "-c" forces colored output (by default system test output is
not printed in color due to run.sh being piped through "tee").

The optional flag "-n" has the same effect as it does for "run.sh" - it causes
the retention of all output files from all tests.

The optional "numproc" argument specifies the maximum number of tests that can
run in parallel.  The default is 1, which means that all of the tests run
sequentially.  If greater than 1, up to "numproc" tests will run simultaneously,
new tests being started as tests finish.  Each test will get a unique set of
ports, so there is no danger of tests interfering with one another.  Parallel
running reduces the total time taken to run the BIND system tests, but means
that the output from the individual tests is interleaved on the screen.
However, the systests.output file produced at the end of the run (in the
bin/tests/system directory) will contain the output from each test in
sequential order.

Note that it is not possible to pass arguments to tests through the "runall.sh"
script.

A run of all the system tests can also be initiated via make:

    make [-j numproc] test

In this case, retention of the output files after a test completes successfully
is specified by setting the environment variable SYSTEMTEST_NO_CLEAN to 1 prior
to running make, e.g.

    SYSTEMTEST_NO_CLEAN=1 make [-j numproc] test

while setting the environment variable SYSTEMTEST_FORCE_COLOR to 1 forces
system test output to be printed in color.

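For example, the following two commands are broadly equivalent ways of running
up to four tests at a time while retaining all output files and forcing
colored output:

    sh runall.sh -c -n 4
    SYSTEMTEST_FORCE_COLOR=1 SYSTEMTEST_NO_CLEAN=1 make -j 4 test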

Running Multiple System Test Suites Simultaneously
---
In some cases it may be desirable to have multiple instances of the system test
suite running simultaneously (e.g. from different terminal windows).  To do
this:

1. Each installation must have its own directory tree.  The system tests create
files in the test directories, so separate directory trees are required to
avoid interference between the same test running in different installations.

2. For one of the test suites, the starting port number must be specified by
setting the environment variable STARTPORT before starting the test suite.
Each test suite comprises about 100 tests, each being allocated a set of 100
ports.  The port ranges for each test are allocated sequentially, so each test
suite requires about 10,000 ports to itself.  By default, the port allocation
starts at 5,000.  So the following set of commands:

    Terminal Window 1:
        cd <installation-1>/bin/tests/system
        sh runall.sh 4

    Terminal Window 2:
        cd <installation-2>/bin/tests/system
        STARTPORT=20000 sh runall.sh 4

... will start the test suite for installation-1 using the default base port
of 5,000, so that test suite will use ports 5,000 through 15,000 (or
thereabouts).  Prefixing the run for installation-2 with "STARTPORT=20000"
means that its test suite uses ports 20,000 through 30,000 or so.


Format of Test Output
---
All output from the system tests is in the form of lines with the following
structure:

    <letter>:<test-name>:<message> [(<number>)]

e.g.

    I:catz:checking that dom1.example is not served by master (1)

The meanings of the fields are as follows:

<letter>
This indicates the type of message.  This is one of:

    S   Start of the test
    A   Start of test (retained for backwards compatibility)
    T   Start of test (retained for backwards compatibility)
    E   End of the test
    I   Information.  A test will typically output many of these messages
        during its run, indicating test progress.  Note that such a message may
        be of the form "I:testname:failed", indicating that a sub-test has
        failed.
    R   Result.  Each test will result in one such message, which is of the
        form:

                R:<test-name>:<result>

        where <result> is one of:

            PASS        The test passed
            FAIL        The test failed
            SKIPPED     The test was not run, usually because some
                        prerequisites required to run the test are missing.

<test-name>
This is the name of the test from which the message emanated, which is also the
name of the subdirectory holding the test files.

<message>
This is text output by the test during its execution.

(<number>)
If present, this will correlate with a file created by the test.  The tests
execute commands and route the output of each command to a file.  The name of
this file depends on the command and the test, but will usually be of the form:

    <command>.out.<suffix><number>

e.g. nsupdate.out.test28, dig.out.q3.  This aids diagnosis of problems by
allowing the output that caused the problem message to be identified.
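
Because every result line begins with "R:", one quick way to summarize a full
run is to search the combined output file for those lines, e.g.:

    grep '^R:' systests.output
    grep '^R:.*:FAIL' systests.output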


Re-Running the Tests
---
If there is a requirement to re-run a test (or the entire test suite), the
files produced by the tests should be deleted first.  Normally, these files are
deleted if the test succeeds but are retained on error.  The run.sh script
automatically calls a given test's clean.sh script before invoking its setup.sh
script.

Deletion of the files produced by the set of tests (e.g. after the execution
of "runall.sh") can be carried out using the command:

    sh cleanall.sh

or

    make testclean

(Note that the Makefile has two other targets for cleaning up files: "clean"
will delete all the files produced by the tests, as well as the object and
executable files used by the tests.  "distclean" does all the work of "clean"
as well as deleting configuration files produced by "configure".)


Developer Notes
===
This section is intended for developers writing new tests.


Overview
---
As noted above, each test is in a separate directory.  To interact with the
test framework, the directories contain the following standard files:

prereq.sh   Run at the beginning to determine whether the test can be run at
            all; if not, we see an R:SKIPPED result.  This file is optional:
            if not present, the test is assumed to have all its prerequisites
            met.

setup.sh    Run after prereq.sh, this sets up the preconditions for the tests.
            Although optional, virtually all tests will require such a file to
            set up the ports they should use for the test.

tests.sh    Runs the actual tests.  This file is mandatory.

clean.sh    Run at the end to clean up temporary files, but only if the test
            was completed successfully and its running was not inhibited by the
            "-n" switch being passed to "run.sh".  Otherwise the temporary
            files are left in place for inspection.

ns<N>       These subdirectories contain test name servers that can be queried
            or can interact with each other.  The value of N indicates the
            address the server listens on: for example, ns2 listens on
            10.53.0.2, and ns4 on 10.53.0.4.  All test servers use an
            unprivileged port, so they don't need to run as root.  These
            servers log at the highest debug level and the log is captured in
            the file "named.run".

ans<N>      Like ns<N>, but these are simple mock name servers implemented in
            Perl or Python.  They are generally programmed to misbehave in ways
            named would not, so as to exercise named's ability to interoperate
            with badly behaved name servers.


Port Usage
---
In order for the tests to run in parallel, each test requires a unique set of
ports.  These are specified by the "-p" option passed to "run.sh", which sets
environment variables that the scripts listed above can reference.

The convention used in the system tests is that the number passed is the start
of a range of 100 ports.  The test is free to use the ports as required,
although the first ten ports in the block are named and generally tests use the
named ports for their intended purpose.  The names of the environment variables
are:

    PORT                     Number to be used for the query port.
    CONTROLPORT              Number to be used as the RNDC control port.
    EXTRAPORT1 - EXTRAPORT8  Eight port numbers that can be used as needed.

Two other environment variables are defined:

    LOWPORT                  The lowest port number in the range.
    HIGHPORT                 The highest port number in the range.

Since port ranges usually start on a boundary of 10, the variables are set such
that the last digit of the port number corresponds to the number of the
EXTRAPORTn variable.  For example, if the port range were to start at 5200, the
port assignments would be:

    PORT = 5200
    EXTRAPORT1 = 5201
        :
    EXTRAPORT8 = 5208
    CONTROLPORT = 5209
    LOWPORT = 5200
    HIGHPORT = 5299

When running tests in parallel (i.e. giving a value of "numproc" greater than 1
in the "make" or "runall.sh" commands listed above), it is guaranteed that each
test will get a set of unique port numbers.
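
For example, a query sent from a test script would be directed at the named
query port rather than port 53 (a sketch; "$DIG" is defined in "conf.sh" and
"example" stands in for whatever zone the test actually serves):

    $DIG -p ${PORT} @10.53.0.2 example SOA > dig.out.test$n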


Writing a Test
---
The test framework requires up to four shell scripts (listed above) as well as
a number of nameserver instances to run.  Certain expectations are put on each
script:


General
---
1. Each of the four scripts will be invoked with the command

    (cd <test-directory> ; sh <script> [<arguments>] )

... so that the working directory when the script starts executing is the test
directory.

2. Arguments can only be passed to the scripts if the test is being run as a
one-off with "run.sh".  In this case, everything on the command line after the
name of the test is passed to each script.  For example, the command:

    sh run.sh -p 12300 mytest -D xyz

... will run "mytest" with a port range of 12300 to 12399.  Each of the
framework scripts provided by the test will be invoked using the remaining
arguments, e.g.:

    (cd mytest ; sh prereq.sh -D xyz)
    (cd mytest ; sh setup.sh -D xyz)
    (cd mytest ; sh tests.sh -D xyz)
    (cd mytest ; sh clean.sh -D xyz)

No arguments will be passed to the test scripts if the test is run as part of
a run of the full test suite (e.g. the tests are started with "runall.sh").

3. Each script should start with the following lines:

    SYSTEMTESTTOP=..
    . $SYSTEMTESTTOP/conf.sh

"conf.sh" defines a series of environment variables together with functions
useful for the test scripts.  (conf.sh.win32 is the Windows equivalent of this
file.)


prereq.sh
---
As noted above, this is optional.  If present, it should check whether specific
software needed to run the test is available and/or whether BIND has been
configured with the required options.

    * If the software required to run the test is present and the BIND
      configure options are correct, prereq.sh should return with a status code
      of 0.

    * If the software required to run the test is not available and/or BIND
      has not been configured with the appropriate options, prereq.sh should
      return with a status code of 1.

    * If there is some other problem (e.g. prerequisite software is available
      but is not properly configured), a status code of 255 should be returned.
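
A minimal "prereq.sh" might therefore look like the following (a sketch only;
the prerequisite being checked is hypothetical):

    SYSTEMTESTTOP=..
    . $SYSTEMTESTTOP/conf.sh

    # This test needs a Perl interpreter for its mock servers.
    if ! perl -e 'exit 0' > /dev/null 2>&1; then
        echo_i "This test requires perl."
        exit 1
    fi

    exit 0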


setup.sh
---
This is responsible for setting up the configuration files used in the test.

To cope with varying port numbers, ports are not hard-coded into
configuration files (or, for that matter, scripts that emulate nameservers).
Instead, setup.sh is responsible for editing the configuration files to set the
port numbers.

To do this, configuration files should be supplied in the form of templates
containing tokens identifying ports.  The tokens have the same name as the
environment variables listed above, but are prefixed and suffixed by the "@"
symbol.  For example, a fragment of a configuration file template might look
like:

    controls {
        inet 10.53.0.1 port @CONTROLPORT@ allow { any; } keys { rndc_key; };
    };

    options {
        query-source address 10.53.0.1;
        notify-source 10.53.0.1;
        transfer-source 10.53.0.1;
        port @PORT@;
        allow-new-zones yes;
    };

setup.sh should copy the template to the desired filename using the
"copy_setports" shell function defined in "conf.sh", i.e.

    copy_setports ns1/named.conf.in ns1/named.conf

This replaces the tokens @PORT@, @CONTROLPORT@, @EXTRAPORT1@ through
@EXTRAPORT8@ with the contents of the environment variables listed above.
setup.sh should do this for all configuration files required when the test
starts.

("setup.sh" should also use this method for replacing the tokens in any Perl or
Python name servers used in the test.)
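
Putting this together, a minimal "setup.sh" might look like the following (a
sketch; the set of files to process depends entirely on the test):

    SYSTEMTESTTOP=..
    . $SYSTEMTESTTOP/conf.sh

    copy_setports ns1/named.conf.in ns1/named.conf
    copy_setports ns2/named.conf.in ns2/named.conf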


tests.sh
---
This is the main test file and the contents depend on the test.  The contents
are completely up to the developer, although most test scripts have a form
similar to the following for each sub-test:

    1. n=`expr $n + 1`
    2. echo_i "prime cache nodata.example ($n)"
    3. ret=0
    4. $DIG -p ${PORT} @10.53.0.1 nodata.example TXT > dig.out.test$n
    5. grep "status: NOERROR" dig.out.test$n > /dev/null || ret=1
    6. grep "ANSWER: 0," dig.out.test$n > /dev/null || ret=1
    7. if [ $ret != 0 ]; then echo_i "failed"; fi
    8. status=`expr $status + $ret`

1.  Increment the test number "n" (initialized to zero at the start of the
    script).

2.  Indicate that the sub-test is about to begin.  Note that "echo_i" instead
    of "echo" is used.  echo_i is a function defined in "conf.sh" which
    prefixes the message with "I:<test-name>:", allowing the output from each
    test to be identified.  The test number is included in the message in
    order to tie the sub-test to its output.

3.  Initialize return status.

4 - 6. Carry out the sub-test.  In this case, a nameserver is queried (note
    that the port used is given by the PORT environment variable, which was set
    by the inclusion of the file "conf.sh" at the start of the script).  The
    output is routed to a file whose suffix includes the test number.  The
    response from the server is examined and, in this case, if the required
    string is not found, an error is indicated by setting "ret" to 1.

7.  If the sub-test failed, a message is printed.  Again, "echo_i" is used so
    that the message is given the "I:<test-name>:" prefix before it is output.

8.  "status", used to track how many of the sub-tests have failed, is
    incremented accordingly.  The value of "status" determines the status
    returned by "tests.sh", which in turn determines whether the framework
    prints the PASS or FAIL message.

Regardless of the details of the test, the rules that should be followed are:

a.  Use the environment variables set by conf.sh to determine the ports to use
    for sending and receiving queries.

b.  Use a counter to tag messages and to associate the messages with the output
    files.

c.  Store all output produced by queries/commands into files.  These files
    should be named according to the command that produced them, e.g. "dig"
    output should be stored in a file "dig.out.<suffix>", the suffix being
    related to the value of the counter.

d.  Use "echo_i" to output informational messages.

e.  Retain a count of test failures and return this as the exit status from
    the script.
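
A skeletal "tests.sh" following these rules might be structured as below (a
sketch; the actual checks, zone names and number of sub-tests are
test-specific):

    SYSTEMTESTTOP=..
    . $SYSTEMTESTTOP/conf.sh

    status=0
    n=0

    # Sub-test 1: check that ns1 answers at all.
    n=`expr $n + 1`
    echo_i "checking that ns1 answers an SOA query ($n)"
    ret=0
    $DIG -p ${PORT} @10.53.0.1 example SOA > dig.out.test$n
    grep "status: NOERROR" dig.out.test$n > /dev/null || ret=1
    if [ $ret != 0 ]; then echo_i "failed"; fi
    status=`expr $status + $ret`

    echo_i "exit status: $status"
    exit $status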


clean.sh
---
The inverse of "setup.sh", this is invoked by the framework to clean up the
test directory.  It should delete all files that have been created by the test
during its run.
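
For example (a sketch; the exact list mirrors whatever the test and its
servers create):

    rm -f dig.out.* ns*/named.conf ns*/named.run ns*/named.lock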


Starting Nameservers
---
As noted earlier, a system test will involve a number of nameservers.  These
will be either instances of named, or special servers written in a language
such as Perl or Python.

For the former, the version of "named" being run is that in the "bin/named"
directory in the tree holding the tests (i.e. if "make test" is being run
immediately after "make", the version of "named" used is that just built).  The
configuration files, zone files etc. for these servers are located in
subdirectories of the test directory named "nsN", where N is a small integer.
The latter are special nameservers, mostly used for generating deliberately bad
responses, located in subdirectories named "ansN" (again, N is an integer).
In addition to configuration files, these directories should hold the
appropriate script files as well.

Note that the "N" for a particular test forms a single number space, e.g. if
there is an "ns2" directory, there cannot be an "ans2" directory as well.
Ideally, the directory numbers should start at 1 and work upwards.

When running a test, the servers are started using "start.sh" (which is nothing
more than a wrapper for start.pl).  The options for "start.pl" are documented
in the header for that file, so will not be repeated here.  In summary, when
invoked by "run.sh", start.pl looks for directories named "nsN" or "ansN" in
the test directory and starts the servers it finds there.


"named" Command-Line Options
---
By default, start.pl starts a "named" server with the following options:

    -c named.conf   Specifies the configuration file to use (so by implication,
                    each "nsN" nameserver's configuration file must be called
                    named.conf).

    -d 99           Sets the maximum debugging level.

    -D <name>       The "-D" option sets a string used to identify the
                    nameserver in a process listing.  In this case, the string
                    is the name of the subdirectory.

    -g              Runs the server in the foreground and logs everything to
                    stderr.

    -m record,size,mctx
                    Turns on these memory usage debugging flags.

    -U 4            Uses four listeners.

    -X named.lock   Acquires a lock on this file in the "nsN" directory, so
                    preventing multiple instances of this named running in this
                    directory (which could possibly interfere with the test).

All output is sent to a file called "named.run" in the nameserver directory.

The options used to start named can be altered.  There are three ways of doing
this.  "start.pl" checks the methods in a specific order: if a check succeeds,
the options are set and any other specification is ignored.  In order, these
are:

1. Specifying options to "start.sh"/"start.pl" after the name of the test
directory, e.g.

    sh start.sh reclimit ns1 -- "-c n.conf -d 43"

(This is only really useful when running tests interactively.)

2. Including a file called "named.args" in the "nsN" directory.  If present,
the contents of the first non-commented, non-blank line of the file are used as
the named command-line arguments.  The rest of the file is ignored.
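
For example, an "ns2/named.args" file with the following contents (a purely
hypothetical illustration, assuming shell-style "#" comments) would start that
server at a lower debug level and with TCP disabled:

    # Replacement command line for this server only.
    -D ns2 -c named.conf -d 3 -g -T notcp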

3. Tweaking the default command line arguments with "-T" options.  This flag is
used to alter the behavior of BIND for testing and is not documented in the
ARM.  The presence of certain files in the "nsN" directory adds flags to
the default command line (the content of the files is irrelevant - it
is only the presence that counts):

    named.noaa       Appends "-T noaa" to the command line, which causes
                     "named" to never set the AA bit in an answer.

    named.dropedns   Adds "-T dropedns" to the command line, which causes
                     "named" to recognise EDNS options in messages, but drop
                     messages containing them.

    named.maxudp1460 Adds "-T maxudp1460" to the command line, setting the
                     maximum UDP size handled by named to 1460.

    named.maxudp512  Adds "-T maxudp512" to the command line, setting the
                     maximum UDP size handled by named to 512.

    named.noedns     Appends "-T noedns" to the command line, which disables
                     recognition of EDNS options in messages.

    named.notcp      Adds "-T notcp", which disables TCP in "named".

    named.soa        Appends "-T nosoa" to the command line, which disables
                     the addition of SOA records to negative responses (or to
                     the additional section if the response is triggered by RPZ
                     rewriting).


Starting Other Nameservers
---
In contrast to "named", nameservers written in Perl or Python (whose script
file should have the name "ans.pl" or "ans.py" respectively) are started with a
fixed command line.  In essence, the server is given the address it should
listen on and nothing else.

(This is not strictly true: Python servers are provided with the number of the
query port to use.  Altering the port used by Perl servers currently requires
creating a template file containing the "@PORT@" token, and having "setup.sh"
substitute the actual port being used before the test starts.)


Stopping Nameservers
---
As might be expected, the test system stops nameservers with the script
"stop.sh", which is little more than a wrapper for "stop.pl".  Like "start.pl",
the options available are listed in the file's header and will not be repeated
here.

In summary though, the nameservers for a given test, if left running by
specifying the "-k" flag to "run.sh" when the test is started, can be stopped
by the command:

    sh stop.sh <test-name> [server]

... where if the server (e.g. "ns1", "ans3") is not specified, all servers
associated with the test are stopped.


Adding a Test to the System Test Suite
---
Once a test has been created, the following files should be edited:

* conf.sh.in  The name of the test should be added to the PARALLELDIRS or
SEQUENTIALDIRS variables as appropriate.  The former is used for tests that
can run in parallel with other tests, the latter for tests that are unable to
do so.

* conf.sh.win32 This is the Windows equivalent of conf.sh.in.  The name of the
test should be added to the PARALLELDIRS or SEQUENTIALDIRS variables as
appropriate.

* Makefile.in The name of the test should be added to one of the PARALLEL
or SEQUENTIAL variables.

(It is likely that a future iteration of the system test suite will remove the
need to edit multiple files to add a test.)


Valgrind
---
When running system tests, named can be run under Valgrind.  The output from
Valgrind is sent to per-process files that can be reviewed after the test has
completed.  To enable this, set the USE_VALGRIND environment variable to
"helgrind" to run the Helgrind tool, or to any other value to run the Memcheck
tool.  To use "helgrind" effectively, build BIND with --disable-atomic.
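
For example, one way to run a single test with "named" under each of the two
tools is:

    USE_VALGRIND=1 sh run.sh dnssec
    USE_VALGRIND=helgrind sh run.sh dnssec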


Maintenance Notes
===
This section is aimed at developers maintaining BIND's system test framework.


Notes on Parallel Execution
---
Although execution of an individual test is controlled by "run.sh", which
executes the above shell scripts (and starts the relevant servers) for each
test, the running of all tests in the test suite is controlled by the Makefile.
("runall.sh" does little more than invoke "make" on the Makefile.)

All system tests are capable of being run in parallel.  For this to work, each
test needs to use a unique set of ports.  To avoid the need to define which
tests use which ports (and so risk port clashes as further tests are added),
the ports are assigned when the tests are run.  This is achieved by having the
"test" target in the Makefile depend on "parallel.mk".  That file is created
when "make check" is run, and contains a target for each test of the form:

    <test-name>:
        @$(SHELL) run.sh -p <baseport> <test-name>

The <baseport> is unique and the values of <baseport> for each test are
separated by at least 100 ports.


Cleaning Up From Tests
---
When a test is run, up to three different types of files are created:

1. Files generated by the test itself, e.g. output from "dig" and "rndc", are
stored in the test directory.

2. Files produced by named which may not be cleaned up if named exits
abnormally, e.g. core files, PID files etc., are stored in the test directory.

3. A file "test.output.<test-name>" containing the text written to stdout by the
test is written to bin/tests/system/.  This file is only produced when the test
is run as part of the entire test suite (e.g. via "runall.sh").

If the test fails, all these files are retained.  But if the test succeeds,
they are cleaned up at different times:

1. Files generated by the test itself are cleaned up by the test's own
"clean.sh", which is called from "run.sh".

2. Files that may not be cleaned up if named exits abnormally can be removed
using the "cleanall.sh" script.

3. "test.output.*" files are deleted when the test suite ends.  At this point,
the "testsummary.sh" script is run, which concatenates all the "test.output.*"
files into a single "systests.output" file before deleting them.
725