.. doctest-skip-all

.. _testing-guidelines:

******************
Testing Guidelines
******************

This section describes the testing framework and format standards for tests in
Astropy core and coordinated packages, and also serves as a set of
recommendations for affiliated packages.

Testing Framework
*****************

The testing framework used by astropy (and packages using the :doc:`Astropy
package template <astropy-package-template>`) is the `pytest`_ framework.

.. _testing-dependencies:

Testing Dependencies
********************

The dependencies used by the Astropy test runner are provided by a separate
package called `pytest-astropy`_. This package provides the ``pytest``
dependency itself, in addition to several ``pytest`` plugins that are used by
Astropy and that will also be of general use to other packages.

Since the testing dependencies are not actually required to install or use
Astropy, they are not included in ``install_requires`` in ``setup.cfg``.
Instead, they are listed in an ``extras_require`` section called ``test`` in
``setup.cfg``. Developers who want to run the test suite will need to either
install pytest-astropy directly::

    pip install pytest-astropy

or install the core package in 'editable' mode, specifying the ``[test]``
option::

    pip install -e .[test]

A detailed description of the plugins can be found in the :ref:`pytest-plugins`
section.

.. _running-tests:

Running Tests
*************

There are currently three different ways to invoke Astropy tests. Each
method invokes `pytest`_ to run the tests but offers different options when
calling. To run the tests, you will need to make sure you have the `pytest`_
package installed.

In addition to running the Astropy tests, these methods can also be used to
check Python source code for `PEP8 compliance
<https://www.python.org/dev/peps/pep-0008/>`_. All of the PEP8 testing
options require the `pytest-pep8 plugin
<https://pypi.org/project/pytest-pep8>`_, which must be installed
separately.

tox
===

The most robust way to run the tests (which can also be the slowest) is
to make use of `Tox <https://tox.readthedocs.io/en/latest/>`__, a
general-purpose tool for automating Python testing. One of the benefits of tox
is that it first creates a source distribution of the package being tested and
installs it into a new virtual environment, along with any dependencies that are
declared in the package, before running the tests. It can therefore catch
issues related to undeclared package data or missing dependencies. Since we use
tox to run many of the tests on continuous integration services, it can also be
used in many cases to reproduce issues seen on those services.

To run the tests with tox, first make sure that tox is installed, e.g.::

    pip install tox

then run the basic test suite with::

    tox -e test

or run the test suite with all optional dependencies with::

    tox -e test-alldeps

You can see a list of available test environments with::

    tox -l -v

which will also explain what each of them does.

You can also run checks or commands not directly related to tests - for instance::

    tox -e codestyle

will run checks using the flake8 tool.

It is possible to pass options to pytest when running tox - to do this, add a
``--`` after the regular tox command; anything after this will be passed to
pytest, e.g.::

    tox -e test -- -v --pdb

This can be used in conjunction with the ``-P`` option provided by the
`pytest-filter-subpackage <https://github.com/astropy/pytest-filter-subpackage>`_
plugin to run just part of the test suite.

.. _running-pytest:

pytest
======

The test suite can also be run directly from the native ``pytest`` command,
which is generally faster than using tox for iterative development. In
this case, it is important for developers to be aware that they must manually
rebuild any extensions by running::

    pip install -e .[test]

before running the tests with pytest::

    pytest

Instead of calling ``pip install -e .[test]``, you can also build the
extensions with::

    python setup.py build_ext --inplace

which avoids also installing the developer version of astropy into your current
environment - note, however, that the ``pip`` command is required if you need to
test parts of the package that rely on certain `entry points
<https://setuptools.readthedocs.io/en/latest/pkg_resources.html#entry-points>`_
being installed.

It is possible to run only the tests for a particular subpackage or set of
subpackages.  For example, to run only the ``wcs`` tests from the
command line::

    pytest -P wcs

Or, to run only the ``wcs`` and ``utils`` tests::

    pytest -P wcs,utils

You can also specify a single directory or file to test from the command line,
e.g.::

    pytest astropy/modeling

or::

    pytest astropy/wcs/tests/test_wcs.py

and this works for ``.rst`` files too::

    pytest astropy/wcs/index.rst

.. _astropy.test():

astropy.test()
==============

Tests can be run from an installed version of Astropy with::

    import astropy
    astropy.test()

This will run all the default tests for Astropy (but will not run the
documentation tests in the ``.rst`` documentation, since those files are
not installed).

Tests for a specific package can be run by specifying the package in the call
to the ``test()`` function::

    astropy.test(package='io.fits')

This method works only with package names that can be mapped to Astropy
directories. As an alternative, you can test a specific directory or file
with the ``test_path`` option::

    astropy.test(test_path='wcs/tests/test_wcs.py')

The ``test_path`` must be specified either relative to the working directory
or absolutely.

By default `astropy.test()`_ will skip tests which retrieve data from the
internet. To turn these tests on, use the ``remote_data`` flag::

    astropy.test(package='io.fits', remote_data=True)

In addition, the ``test`` function supports any of the options that can be
passed to :ref:`pytest.main() <pytest:pytest.main-usage>`,
as well as the convenience options ``verbose=`` and ``pastebin=``.

Enable PEP8 compliance testing with ``pep8=True`` in the call to
``astropy.test``. This will enable PEP8 checking and disable regular tests.

Astropy Test Function
---------------------

.. autofunction:: astropy.test

Test-running options
====================

.. _open-files:

Testing for open files
----------------------

Using the :ref:`openfiles-plugin` plugin (which is installed automatically
when installing pytest-astropy), we can test whether any of the unit tests
inadvertently leave files open.  Since this greatly slows down the time it
takes to run the tests, it is turned off by default.

To use it from the command line, do::

    pytest --open-files

To use it from Python, do::

    >>> import astropy
    >>> astropy.test(open_files=True)

For more information on the ``pytest-openfiles`` plugin, see
:ref:`openfiles-plugin`.

Test coverage reports
---------------------

Coverage reports can be generated using the `pytest-cov
<https://pypi.org/project/pytest-cov/>`_ plugin (which is installed
automatically when installing pytest-astropy) by using e.g.::

    pytest --cov astropy --cov-report html

There is some configuration inside the ``setup.cfg`` file that
defines files to omit as well as lines to exclude.

Running tests in parallel
-------------------------

It is possible to speed up astropy's tests using the `pytest-xdist
<https://pypi.org/project/pytest-xdist>`_ plugin.

Once installed, tests can be run in parallel using the ``-n``
command line option. For example, to use 4 processes::

    pytest -n 4

Pass ``-n auto`` to create the same number of processes as there are
cores on your machine.

Similarly, this feature can be invoked from ``astropy.test``::

    >>> import astropy
    >>> astropy.test(parallel=4)

.. _writing-tests:

Writing tests
*************

``pytest`` has the following test discovery rules:

 * ``test_*.py`` or ``*_test.py`` files
 * ``Test`` prefixed classes (without an ``__init__`` method)
 * ``test_`` prefixed functions and methods

Consult the :ref:`test discovery rules <pytest:python test discovery>`
for detailed information on how to name files and tests so that they are
automatically discovered by `pytest`_.

Simple example
==============

The following example shows a simple function and a (deliberately failing)
test of that function::

    def func(x):
        """Add one to the argument."""
        return x + 1

    def test_answer():
        """Check the return value of func() for an example argument."""
        assert func(3) == 5

If we place this in a ``test.py`` file and then run::

    pytest test.py

the result is::

    ============================= test session starts ==============================
    python: platform darwin -- Python 3.x.x -- pytest-x.x.x
    test object 1: /Users/username/tmp/test.py

    test.py F

    =================================== FAILURES ===================================
    _________________________________ test_answer __________________________________

        def test_answer():
    >       assert func(3) == 5
    E       assert 4 == 5
    E        +  where 4 = func(3)

    test.py:5: AssertionError
    =========================== 1 failed in 0.07 seconds ===========================

Where to put tests
==================

Package-specific tests
----------------------

Each package should include a suite of unit tests covering as many of
the public methods/functions as possible. These tests should be
included inside each sub-package, e.g.::

    astropy/io/fits/tests/

``tests`` directories should contain an ``__init__.py`` file so that
the tests can be imported and so that they can use relative imports.

Interoperability tests
----------------------

Tests involving two or more sub-packages should be included in::

    astropy/tests/

Regression tests
================

Any time a bug is fixed, and wherever possible, one or more regression tests
should be added to ensure that the bug is not reintroduced in the future.
Regression tests should include the ticket URL where the bug was reported.
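Such a test can pin the fixed behavior and point back to the report in its
docstring. The sketch below is purely illustrative: the ``normalize_angle``
helper and the issue URL are hypothetical placeholders, not part of Astropy.

```python
def normalize_angle(angle):
    """Wrap an angle in degrees into the range [0, 360)."""
    return angle % 360.0


def test_normalize_angle_negative():
    """Regression test for negative angles not being wrapped into [0, 360).

    See https://github.com/astropy/astropy/issues/NNNN (placeholder URL).
    """
    assert normalize_angle(-90.0) == 270.0
```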

.. _data-files:

Working with data files
=======================

Tests that need to make use of a data file should use the
`~astropy.utils.data.get_pkg_data_fileobj` or
`~astropy.utils.data.get_pkg_data_filename` functions.  These functions
search locally first, and then on the astropy data server or an arbitrary
URL, and return a file-like object or a local filename, respectively.  They
automatically cache the data locally if remote data is obtained, and from
then on the local copy will be used transparently.  See the next section for
notes specific to dealing with the cache in tests.

They also support the use of an MD5 hash to get a specific version of a data
file.  This hash can be obtained prior to submitting a file to the astropy
data server by using the `~astropy.utils.data.compute_hash` function on a
local copy of the file.
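The hash used here is an MD5 digest of the file contents. As a rough
illustration of how such a digest is obtained (a standard-library sketch;
``md5_of_file`` is a hypothetical helper, not the actual ``compute_hash``
implementation):

```python
import hashlib


def md5_of_file(path, block_size=65536):
    """Return the MD5 hex digest of a file, reading it in blocks."""
    md5 = hashlib.md5()
    with open(path, "rb") as f:
        # Read in fixed-size blocks so large files do not need to fit in memory.
        for block in iter(lambda: f.read(block_size), b""):
            md5.update(block)
    return md5.hexdigest()
```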

Tests that may retrieve remote data should be marked with the
``@pytest.mark.remote_data`` decorator, or, if a doctest, flagged with the
``REMOTE_DATA`` flag.  Tests marked in this way will be skipped by default by
``astropy.test()`` to prevent test runs from taking too long. These tests can
be run by ``astropy.test()`` by adding the ``remote_data='any'`` flag.  Turn on
the remote data tests at the command line with ``pytest --remote-data=any``.

It is possible to mark tests using
``@pytest.mark.remote_data(source='astropy')``, which can be used to indicate
that the only required data is from the http://data.astropy.org server. To
enable just these tests, you can run the
tests with ``pytest --remote-data=astropy``.

For more information on the ``pytest-remotedata`` plugin, see
:ref:`remotedata-plugin`.

Examples
--------
.. code-block:: python

    import pytest

    from ...config import get_data_filename

    def test_1():
        """Test version using a local file."""
        # if filename.fits is a local file in the source distribution
        datafile = get_data_filename('filename.fits')
        # do the test

    @pytest.mark.remote_data
    def test_2():
        """Test version using a remote file."""
        # this is the hash for a particular version of a file stored on the
        # astropy data server.
        datafile = get_data_filename('hash/94935ac31d585f68041c08f87d1a19d4')
        # do the test

    def doctest_example():
        """
        >>> datafile = get_data_filename('hash/94935')  # doctest: +REMOTE_DATA
        """
        pass

The ``get_remote_test_data`` function will place the files in a temporary
directory indicated by the ``tempfile`` module, so that the test files will
eventually get removed by the system. In the long term, once test data files
become too large, we will need to design a mechanism for removing test data
immediately.

Tests that use the file cache
-----------------------------

By default, the Astropy test runner sets up a clean file cache in a temporary
directory that is used only for that test run and then destroyed.  This is to
ensure consistency between test runs, as well as to not clutter users' caches
(i.e. the cache directory returned by `~astropy.config.get_cache_dir`) with
test files.

However, some test authors (especially for affiliated packages) may find it
desirable to cache files downloaded during a test run in a more permanent
location (e.g. for large data sets).  To this end the
`~astropy.config.set_temp_cache` helper may be used.  It can be used either as
a context manager within a test to temporarily set the cache to a custom
location, or as a *decorator* that takes effect for an entire test function
(not including setup or teardown, which would have to be decorated separately).

Furthermore, it is possible to change the location of the cache directory
for the duration of the test run by setting the ``XDG_CACHE_HOME``
environment variable.


Tests that create files
=======================

Tests may often be run from directories where users do not have write
permissions, so tests which create files should always do so in
temporary directories. This can be done with the
:ref:`pytest 'tmpdir' fixture <pytest:tmpdir handling>` or with
Python's built-in :ref:`tempfile module <python:tempfile-examples>`.
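For instance, using pytest's ``tmp_path`` fixture (a ``pathlib``-based sibling
of ``tmpdir``), a test can write its output into a per-test temporary directory
that pytest creates and cleans up for you; the test below is a minimal sketch:

```python
def test_write_catalog(tmp_path):
    """Write a small file inside pytest's per-test temporary directory."""
    out = tmp_path / "catalog.txt"
    out.write_text("ra,dec\n10.0,-5.0\n")
    assert out.read_text().startswith("ra,dec")
```

pytest injects ``tmp_path`` automatically when the test runs, so no manual
cleanup code is needed in the test itself.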


Setting up/Tearing down tests
=============================

In some cases, it can be useful to run a series of tests requiring something
to be set up first. There are four ways to do this:

Module-level setup/teardown
---------------------------

If the ``setup_module`` and ``teardown_module`` functions are specified in a
file, they are called before and after all the tests in the file, respectively.
These functions take one argument, which is the module itself, which makes it
very easy to set module-wide variables::

    def setup_module(module):
        """Initialize the value of NUM."""
        module.NUM = 11

    def add_num(x):
        """Add pre-defined NUM to the argument."""
        return x + NUM

    def test_42():
        """Ensure that add_num() adds the correct NUM to its argument."""
        added = add_num(42)
        assert added == 53

We can use this, for example, to download a remote test data file and have all
the test functions in the file access it::

    import os

    def setup_module(module):
        """Store a copy of the remote test file."""
        module.DATAFILE = get_remote_test_data('94935ac31d585f68041c08f87d1a19d4')

    def test():
        """Perform test using cached remote input file."""
        f = open(DATAFILE, 'rb')
        # do the test

    def teardown_module(module):
        """Clean up remote test file copy."""
        os.remove(DATAFILE)

Class-level setup/teardown
--------------------------

Tests can be organized into classes that have their own setup/teardown
methods. In the following example::

    def add_nums(x, y):
        """Add two numbers."""
        return x + y

    class TestAdd42(object):
        """Tests for add_nums with y=42."""

        def setup_class(self):
            self.NUM = 42

        def test_1(self):
            """Test behavior for a specific input value."""
            added = add_nums(11, self.NUM)
            assert added == 53

        def test_2(self):
            """Test behavior for another input value."""
            added = add_nums(13, self.NUM)
            assert added == 55

        def teardown_class(self):
            pass

the ``setup_class`` method is called first, then all the
tests in the class, and finally ``teardown_class`` is called.

Method-level setup/teardown
---------------------------

There are cases where one might want setup and teardown methods to be run
before and after *each* test. For this, use the ``setup_method`` and
``teardown_method`` methods::

    def add_nums(x, y):
        """Add two numbers."""
        return x + y

    class TestAdd42(object):
        """Tests for add_nums with y=42."""

        def setup_method(self, method):
            self.NUM = 42

        def test_1(self):
            """Test behavior for a specific input value."""
            added = add_nums(11, self.NUM)
            assert added == 53

        def test_2(self):
            """Test behavior for another input value."""
            added = add_nums(13, self.NUM)
            assert added == 55

        def teardown_method(self, method):
            pass

Function-level setup/teardown
-----------------------------

Finally, one can use ``setup_function`` and ``teardown_function`` to define a
setup/teardown mechanism to be run before and after each function in a module.
These take one argument, which is the function being tested::

    def setup_function(function):
        pass

    def test_1():
        """First test."""
        # do test

    def test_2():
        """Second test."""
        # do test

    def teardown_function(function):
        pass

Property-based tests
====================

`Property-based testing
<https://increment.com/testing/in-praise-of-property-based-testing/>`_
lets you focus on the parts of your test that matter, by making more
general claims - "works for any two numbers" instead of "works for 1 + 2".
Imagine if random testing gave you minimal, non-flaky failing examples,
and a clean way to describe even the most complicated data - that's
property-based testing!

``pytest-astropy`` includes a dependency on `Hypothesis
<https://hypothesis.readthedocs.io/>`_, so installation is easy -
you can just read the docs or `work through the tutorial
<https://github.com/Zac-HD/escape-from-automanual-testing/>`_
and start writing tests like::

    from astropy.coordinates import SkyCoord
    from hypothesis import given, strategies as st

    @given(
        st.builds(SkyCoord, ra=st.floats(0, 360), dec=st.floats(-90, 90))
    )
    def test_coordinate_transform(coord):
        """Test that a sky coordinate round-trips from ICRS to Galactic and back."""
        assert coord == coord.galactic.icrs  # floating-point precision alert!

Other properties that you could test include:

- Round-tripping from image to sky coordinates and back should be lossless
  for distortion-free mappings, and otherwise always below 10^-5 px.
- Take a moment in time, round-trip it through various frames, and check that
  it hasn't changed or lost precision (or at least not by more than a
  nanosecond).
- IO routines losslessly round-trip data that they are expected to handle.
- Optimized routines calculate the same result as unoptimized ones, within
  tolerances.

This is a great way to start contributing to Astropy, and has already found
bugs in time handling.  See issue #9017 and pull request #9532 for details!

(and if you find Hypothesis useful in your research,
`please cite it <https://doi.org/10.21105/joss.01891>`_!)


Parametrizing tests
===================

If you want to run a test several times for slightly different values,
you can use ``pytest`` to avoid writing separate tests.
For example, instead of writing::

    def test1():
        assert type('a') == str

    def test2():
        assert type('b') == str

    def test3():
        assert type('c') == str

you can use the ``@pytest.mark.parametrize`` decorator to concisely
create a test function for each input::

    import pytest

    @pytest.mark.parametrize('letter', ['a', 'b', 'c'])
    def test(letter):
        """Check that the input is a string."""
        assert type(letter) == str

As a guideline, use ``parametrize`` if you can enumerate all possible
test cases and each failure would be a distinct issue, and Hypothesis
when there are many possible inputs or you only want a single simple
failure to be reported.

Tests requiring optional dependencies
=====================================

For tests that test functions or methods that require optional dependencies
(e.g., Scipy), pytest should be instructed to skip the test if the dependencies
are not present, as the ``astropy`` tests should succeed even if an optional
dependency is not present. ``astropy`` provides a list of boolean flags that
test whether optional dependencies are installed (at import time). For example,
to load the corresponding flag for Scipy and mark a test to skip if Scipy is not
present, use::

    import pytest
    from astropy.utils.compat.optional_deps import HAS_SCIPY

    @pytest.mark.skipif(not HAS_SCIPY, reason='scipy is required')
    def test_that_uses_scipy():
        ...

These variables should exist for all of Astropy's optional dependencies; a
complete list of supported flags can be found in
``astropy.utils.compat.optional_deps``.

Any new optional dependencies should be added to that file, as well as to
relevant entries in ``setup.cfg`` under ``options.extras_require``:
typically, under ``all`` for dependencies used in user-facing code
(e.g., ``h5py``, which is used to write tables to HDF5 format),
and in ``test_all`` for dependencies only used in tests (e.g.,
``skyfield``, which is used to cross-check the accuracy of coordinate
transforms).

Using pytest helper functions
=============================

If your tests need to use `pytest helper functions
<https://docs.pytest.org/en/latest/reference/reference.html#functions>`_, such as
``pytest.raises``, import ``pytest`` into your test module like so::

    import pytest

Testing warnings
================

In order to test that warnings are triggered as expected in certain
situations, `pytest`_ provides its own context manager,
:ref:`pytest.warns <pytest:warns>`, which, completely
analogously to ``pytest.raises`` (see below), allows probing explicitly
for specific warning classes and, through the optional ``match`` argument,
for specific messages. Note that when no warning of the specified type is
triggered, this will make the test fail. When checking for optional,
but not mandatory, warnings, ``pytest.warns(None)`` can be used to catch and
inspect them.
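A minimal, self-contained sketch of this pattern (the ``deprecated_helper``
function below is hypothetical, not part of any real API) could look like:

```python
import warnings

import pytest


def deprecated_helper():
    """Hypothetical function that emits a deprecation warning."""
    warnings.warn("use new_helper() instead", DeprecationWarning)


def test_deprecation_is_warned():
    """Both the warning class and the message pattern must match."""
    with pytest.warns(DeprecationWarning, match="new_helper"):
        deprecated_helper()
```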

.. note::

   With `pytest`_ there is also the option of using the
   :ref:`recwarn <pytest:recwarn>` function argument to test that
   warnings are triggered within the entire embedding function.
   This method has been found to be problematic in at least one case
   (`pull request 1174 <https://github.com/astropy/astropy/pull/1174#issuecomment-20249309>`_).

Testing exceptions
==================

Just like the handling of warnings described above, tests that are
designed to trigger certain errors should verify that an exception of
the expected type is raised in the expected place.  This is efficiently
done by running the tested code inside the
:ref:`pytest.raises <pytest:assertraises>`
context manager.  Its optional ``match`` argument allows checking the
error message for any pattern using regular expression (``re``) syntax.
For example, the matches ``pytest.raises(OSError, match=r'^No such file')`` and
``pytest.raises(OSError, match=r'or directory$')`` would be equivalent
to ``assert str(err).startswith('No such file')`` and ``assert
str(err).endswith('or directory')``, respectively, on the raised error
message ``err``.
For matching multi-line messages you need to pass the ``(?s)``
:ref:`flag <python:re-syntax>`
to the underlying ``re.search``, as in the example below::

  with pytest.raises(fits.VerifyError, match=r'(?s)not upper.+ Illegal key') as excinfo:
      hdu.verify('fix+exception')
  assert str(excinfo.value).count('Card') == 2

This invocation also illustrates how to get an ``ExceptionInfo`` object
returned to perform additional diagnostics on the error.

Testing configuration parameters
================================

In order to ensure reproducibility of tests, all configuration items
are reset to their default values when the test runner starts up.

Sometimes you'll want to test the behavior of code when a certain
configuration item is set to a particular value.  In that case, you
can use the `astropy.config.ConfigItem.set_temp` context manager to
temporarily set a configuration item to that value, test within that
context, and have it automatically return to its original value.

For example::

    def test_pprint():
        from ... import conf
        with conf.set_temp('max_lines', 6):
            # ...

Marking blocks of code to exclude from coverage
===============================================

Blocks of code may be ignored by the coverage testing by adding a
comment containing the phrase ``pragma: no cover`` to the start of the
block::

    if this_rarely_happens:  # pragma: no cover
        this_call_is_ignored()

.. _image-tests:

Image tests with pytest-mpl
===========================

Running image tests
-------------------

We make use of the `pytest-mpl <https://pypi.org/project/pytest-mpl>`_
plugin to write tests where we can compare the output of plotting commands
with reference files on a pixel-by-pixel basis (this is used for instance in
:ref:`astropy.visualization.wcsaxes <wcsaxes>`).

To run the Astropy tests with the image comparison, use::

    pytest --mpl --remote-data=astropy

However, note that the output can be very sensitive to the version of Matplotlib
as well as all its dependencies (e.g., freetype), so we recommend running the
image tests inside a `Docker <https://www.docker.com/>`__ container which has a
frozen set of package versions (Docker containers can be thought of as mini
virtual machines). See our ``.circleci/config.yml`` for reference.

Writing image tests
-------------------

The `README.rst <https://github.com/matplotlib/pytest-mpl/blob/master/README.rst>`__
for the plugin contains information on writing tests with this plugin. The only
key addition compared to those instructions is that you should set
``baseline_dir``::

    from astropy.tests.image_tests import IMAGE_REFERENCE_DIR

    @pytest.mark.mpl_image_compare(baseline_dir=IMAGE_REFERENCE_DIR)

This is because the reference image files would contribute significantly
to the repository size, so we instead store them on the http://data.astropy.org
site. The downside is that it is a little more complicated to create or
re-generate reference files, but we describe the process here.

Generating reference images
---------------------------

Any failed test on CircleCI will provide you with the old and the new reference
images, along with the difference image. After you have determined that the
new reference image is acceptable, you can download it from the "artifacts"
tab on the CircleCI dashboard.

Uploading the reference images
------------------------------

Next, we need to add these images to the http://data.astropy.org server. To do
this, open a pull request to the `astropy-data <https://github.com/astropy/astropy-data>`_
repository. The reference images for Astropy tests should go inside the
`testing/astropy <https://github.com/astropy/astropy-data/tree/gh-pages/testing/astropy>`_
directory. In that directory are folders named as timestamps. If you are simply
adding new tests, add the reference files to the most recent directory.

If you are re-generating baseline images due to changes in Astropy, make a new
timestamp directory by copying the most recent one, then replace any
baseline images that have changed. Note that due to changes between Matplotlib
versions, we need to add a whole set of reference images for each major
Matplotlib version. Therefore, in each timestamp folder, there are folders named
e.g. ``1.4.x`` and ``1.5.x``.

Once the reference images are merged in and available on
http://data.astropy.org, update the timestamp in the ``IMAGE_REFERENCE_DIR``
variable in the ``astropy.tests.image_tests`` sub-module. Because the timestamp
is hard-coded, adding a new timestamp directory will not interfere with testing
for released versions of Astropy, so you can easily add and tweak a new
timestamp directory while still working on a pull request to Astropy.

.. _doctests:

Writing doctests
****************

A doctest in Python is a special kind of test that is embedded in a
function, class, or module's docstring, or in the narrative Sphinx
documentation, and is formatted to look like a Python interactive
session; that is, it shows lines of Python code entered at a ``>>>``
prompt followed by the output that would be expected (if any) when
running that code in an interactive session.

The idea is to write usage examples in docstrings that users can enter
verbatim and check their output against the expected output to confirm that
they are using the interface properly.

Furthermore, Python includes a :mod:`doctest` module that can detect these
doctests and execute them as part of a project's automated test suite.  This
way we can automatically ensure that all doctest-like examples in our
docstrings are correct.
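For instance, a docstring example like the following is collected and executed
automatically when the test suite runs (the ``add_one`` function is purely
illustrative):

```python
def add_one(x):
    """Return ``x`` incremented by one.

    Examples
    --------
    >>> add_one(3)
    4
    """
    return x + 1
```

When the doctest runner encounters the docstring, it executes ``add_one(3)``
and compares the result against the expected output ``4``.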
848
849The Astropy test suite automatically detects and runs any doctests in the
850astropy source code or documentation, or in packages using the Astropy test
851running framework. For example doctests and detailed documentation on how to
852write them, see the full :mod:`doctest` documentation.
853
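For instance, a doctest embedded in a docstring can be exercised directly with
the standard library's :mod:`doctest` machinery. This is a minimal sketch; the
``to_parsec`` function is hypothetical and exists only for illustration:

```python
import doctest

def to_parsec(light_years):
    """Convert a distance in light years to parsecs (1 pc is about 3.2616 ly).

    >>> to_parsec(3.2616)
    1.0
    """
    return light_years / 3.2616

# Find and run the doctest embedded in the docstring, much as the test
# runner does for every docstring it collects.
runner = doctest.DocTestRunner()
for test in doctest.DocTestFinder().find(to_parsec, globs={"to_parsec": to_parsec}):
    runner.run(test)
print(runner.failures)  # 0
```

In practice you rarely drive doctests by hand like this; ``pytest`` collects
and runs them automatically when doctest collection is enabled.
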
.. note::

   Since the narrative Sphinx documentation is not installed alongside the
   astropy source code, it can only be tested by running ``pytest`` directly (or
   via tox), not by ``import astropy; astropy.test()``.

For more information on the ``pytest-doctestplus`` plugin used by Astropy, see
:ref:`doctestplus-plugin`.

.. _skipping-doctests:

Skipping doctests
=================

Sometimes it is necessary to write examples that look like doctests but that
are not actually executable verbatim, for example because the example depends
on some external condition being fulfilled. In these cases there are a few
ways to skip a doctest:

1. Next to the example add a comment like: ``# doctest: +SKIP``.  For example:

   .. code-block:: none

     >>> import os
     >>> os.listdir('.')  # doctest: +SKIP

   In the above example we want to direct the user to run ``os.listdir('.')``,
   but we don't want that line to be executed as part of the doctest.

   To skip tests that require fetching remote data, use the ``REMOTE_DATA``
   flag instead.  This way they can be turned on using the
   ``--remote-data`` option when running the tests:

   .. code-block:: none

     >>> datafile = get_data_filename('hash/94935')  # doctest: +REMOTE_DATA

2. Astropy's test framework adds support for a special ``__doctest_skip__``
   variable that can be placed at the module level of any module to list
   functions, classes, and methods in that module whose doctests should not
   be run.  That is, if it doesn't make sense to run a function's example
   usage as a doctest, the entire function can be skipped in the doctest
   collection phase.

   The value of ``__doctest_skip__`` should be a list of wildcard patterns
   for all functions/classes whose doctests should be skipped.  For example::

       __doctest_skip__ = ['myfunction', 'MyClass', 'MyClass.*']

   skips the doctests in a function called ``myfunction``, the doctest for a
   class called ``MyClass``, and all *methods* of ``MyClass``.

   Module docstrings may contain doctests as well.  To skip the module-level
   doctests, include the string ``'.'`` in ``__doctest_skip__``.

   To skip all doctests in a module::

       __doctest_skip__ = ['*']

3. In the Sphinx documentation, a doctest section can be skipped by
   making it part of a ``doctest-skip`` directive::

       .. doctest-skip::

           >>> # This is a doctest that will appear in the documentation,
           >>> # but will not be executed by the testing framework.
           >>> 1 / 0  # Divide by zero, ouch!

   It is also possible to skip all doctests below a certain line using
   a ``doctest-skip-all`` comment.  Note the lack of ``::`` at the end
   of the line here::

       .. doctest-skip-all

       All doctests below here are skipped...

4. ``__doctest_requires__`` is a way to list dependencies for specific
   doctests.  It should be a dictionary mapping wildcard patterns (in the same
   format as ``__doctest_skip__``) to a list of one or more modules that should
   be *importable* in order for the tests to run.  For example, if some tests
   require the scipy module to work, they will be skipped unless ``import
   scipy`` is possible.  It is also possible to use a tuple of wildcard
   patterns as a key in this dict::

       __doctest_requires__ = {('func1', 'func2'): ['scipy']}

   Having this module-level variable will require ``scipy`` to be importable
   in order to run the doctests for functions ``func1`` and ``func2`` in that
   module.

   In the Sphinx documentation, a doctest requirement can be marked with the
   ``doctest-requires`` directive::

       .. doctest-requires:: scipy

           >>> import scipy
           >>> scipy.hamming(...)


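The wildcard patterns accepted by ``__doctest_skip__`` and
``__doctest_requires__`` follow shell-style globbing. As an illustration only
(a sketch using the standard library's ``fnmatch``; the plugin's internal
matching code may differ), the patterns from the example above match names
like this:

```python
from fnmatch import fnmatch

patterns = ['myfunction', 'MyClass', 'MyClass.*']

def is_skipped(name):
    """Return True if any wildcard pattern matches the dotted test name."""
    return any(fnmatch(name, pattern) for pattern in patterns)

print(is_skipped('myfunction'))      # True
print(is_skipped('MyClass.method'))  # True (matched by 'MyClass.*')
print(is_skipped('other_function'))  # False
```
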
Skipping output
===============

One of the important aspects of writing doctests is that the example output
can be accurately compared to the actual output produced when running the
test.

By default the doctest system compares the actual output to the example output
verbatim, but this is not always feasible.  For example, the example output may
contain the ``__repr__`` of an object which displays its id (which will change
on each run), or a test that expects an exception may output a traceback.

The simplest way to generalize the example output is to use the ellipsis
``...``.  For example::

    >>> 1 / 0
    Traceback (most recent call last):
    ...
    ZeroDivisionError: division by zero

This doctest expects an exception with a traceback, but the text of the
traceback is skipped in the example output--only the first and last lines
of the output are checked.  See the :mod:`doctest` documentation for
more examples of skipping output.

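This behavior comes from the plain :mod:`doctest` module itself, which ignores
the traceback body by design. A minimal, self-contained check (the ``divide``
helper is hypothetical):

```python
import doctest

def divide(a, b):
    """Divide two numbers.

    >>> divide(1, 0)
    Traceback (most recent call last):
    ...
    ZeroDivisionError: division by zero
    """
    return a / b

# The doctest passes: only the traceback header and the exception line
# are compared, the stack frames in between are skipped.
runner = doctest.DocTestRunner()
for test in doctest.DocTestFinder().find(divide, globs={"divide": divide}):
    runner.run(test)
print(runner.failures)  # 0
```
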
Ignoring all output
-------------------

Another possibility for ignoring output is to use the
``# doctest: +IGNORE_OUTPUT`` flag.  This allows a doctest to execute and
checks that the code runs without errors, while ignoring the entire output
in cases where we don't care what the output is.  This differs from using
ellipses in that we can still provide complete example output, just without
the test checking that it is exactly right.  For example::

    >>> print('Hello world')  # doctest: +IGNORE_OUTPUT
    We don't really care what the output is as long as there were no errors...

.. _handling-float-output:

Handling float output
=====================

Some doctests may produce output that contains string representations of
floating point values.  Floating point representations are often not exact and
contain roundoffs in their least significant digits.  Depending on the platform
the tests are being run on (different Python versions, different OS, etc.) the
exact number of digits shown can differ.  Because doctests work by comparing
strings, this can cause such tests to fail.

To address this issue, the ``pytest-doctestplus`` plugin provides support for a
``FLOAT_CMP`` flag that can be used with doctests.  For example:

.. code-block:: none

  >>> 1.0 / 3.0  # doctest: +FLOAT_CMP
  0.333333333333333311

When this flag is used, the expected and actual outputs are both parsed to find
any floating point values in the strings.  Those are then converted to actual
Python `float` objects and compared numerically.  This means that small
differences in representation of roundoff digits will be ignored by the
doctest.  The values are otherwise compared exactly, so more significant
(albeit possibly small) differences will still be caught by these tests.

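The idea behind ``FLOAT_CMP`` can be sketched with a few lines of standard
library code. This is an illustration only: ``pytest-doctestplus`` has its own,
more careful implementation, and this sketch compares just the embedded floats,
not the surrounding text:

```python
import math
import re

# Hypothetical, simplified pattern for decimal float literals in output.
FLOAT_RE = re.compile(r'[+-]?\d+\.\d+(?:[eE][+-]?\d+)?')

def floats_match(expected, actual, rel_tol=1e-9):
    """Compare two output strings, treating embedded floats numerically."""
    expected_floats = [float(s) for s in FLOAT_RE.findall(expected)]
    actual_floats = [float(s) for s in FLOAT_RE.findall(actual)]
    if len(expected_floats) != len(actual_floats):
        return False
    return all(math.isclose(e, a, rel_tol=rel_tol)
               for e, a in zip(expected_floats, actual_floats))

# Roundoff in the last digits is tolerated...
print(floats_match('0.333333333333333311', '0.3333333333333333'))  # True
# ...but genuinely different values still fail.
print(floats_match('0.333333333333333311', '0.334'))  # False
```
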
Continuous integration
**********************

Overview
========

Astropy uses the following continuous integration (CI) services:

* `GitHub Actions <https://github.com/astropy/astropy/actions>`_ for
  Linux, macOS, and Windows setups.
  (Note: GitHub Actions does not have "allowed failures" yet, so you might
  see a failed job reported for your PR with "(Allowed Failure)" in its name.
  Still, some failures might be real and related to your changes, so check
  it anyway!)
* `CircleCI <https://circleci.com>`_ for visualization tests

These services test the package for each commit and pull request pushed to
GitHub, so that breakages are noticed quickly.

In some cases, you may see failures on continuous integration services that
you do not see locally, for example because the operating system is different,
or because the failure happens only with 32-bit Python.

.. _pytest-plugins:

Pytest Plugins
**************

The following ``pytest`` plugins are maintained and used by Astropy. They are
included as dependencies of the ``pytest-astropy`` package, which is now
required for testing Astropy. More information on all of the plugins provided
by the ``pytest-astropy`` package (including dependencies not maintained by
Astropy) can be found `here <https://github.com/astropy/pytest-astropy>`__.

.. _remotedata-plugin:

pytest-remotedata
=================

The `pytest-remotedata`_ plugin allows developers to control whether to run
tests that access data from the internet. The plugin provides two decorators
that can be used to mark individual test functions or entire test classes:

* ``@pytest.mark.remote_data`` for tests that require data from the internet
* ``@pytest.mark.internet_off`` for tests that should run only when there is no
  internet access. This is useful for testing local data caches or fallbacks
  for when no network access is available.

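For example, marking tests might look like this (a sketch; the test names and
bodies are hypothetical):

```python
import pytest

@pytest.mark.remote_data
def test_download_catalog():
    # Fetches data over the network; only runs with --remote-data.
    ...

@pytest.mark.internet_off
def test_cache_fallback():
    # Exercises the local cache; only runs when the network is known
    # to be unavailable.
    ...
```
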
The plugin also adds the ``--remote-data`` option to the ``pytest`` command
(which is also made available through the Astropy test runner).

If the ``--remote-data`` option is not provided when running the test suite, or
if ``--remote-data=none`` is provided, all tests that are marked with
``remote_data`` will be skipped. All tests that are marked with
``internet_off`` will be executed. Any test that attempts to access the
internet but is not marked with ``remote_data`` will result in a failure.

Providing either the ``--remote-data`` option, or ``--remote-data=any``, will
cause all tests marked with ``remote_data`` to be executed. Any tests that are
marked with ``internet_off`` will be skipped.

Running the tests with ``--remote-data=astropy`` will cause only tests that
receive remote data from Astropy data sources to be run. Tests with any other
data sources will be skipped. This is indicated in the test code by marking
test functions with ``@pytest.mark.remote_data(source='astropy')``. Tests
marked with ``internet_off`` will also be skipped in this case.

Also see :ref:`data-files`.

.. _doctestplus-plugin:

pytest-doctestplus
==================

The `pytest-doctestplus`_ plugin provides advanced doctest features, including:

* handling doctests that use remote data in conjunction with the
  ``pytest-remotedata`` plugin above (see :ref:`data-files`)
* approximate floating point comparison for doctests that produce floating
  point results (see :ref:`handling-float-output`)
* skipping particular classes, methods, and functions when running doctests
  (see :ref:`skipping-doctests`)
* optional inclusion of ``*.rst`` files for doctests

This plugin provides two command line options: ``--doctest-plus`` for enabling
the advanced features mentioned above, and ``--doctest-rst`` for including
``*.rst`` files in doctest collection.

The Astropy test runner enables both of these options by default. When running
the test suite directly from ``pytest`` (instead of through the Astropy test
runner), it is necessary to explicitly provide these options when they are
needed.

.. _openfiles-plugin:

pytest-openfiles
================

The `pytest-openfiles`_ plugin allows for the detection of open I/O resources
at the end of unit tests. This plugin adds the ``--open-files`` option to the
``pytest`` command (which is also exposed through the Astropy test runner).

When running tests with ``--open-files``, if a file is opened during the course
of a unit test but is not closed before the test finishes, the test will fail.
This is particularly useful for testing code that manipulates file handles or
other I/O resources. It allows developers to ensure that this kind of code
properly cleans up I/O resources when they are no longer needed.

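The failure mode this guards against, and the usual fix, can be sketched as
follows (the helper names are hypothetical):

```python
import os
import tempfile

def read_first_line_leaky(path):
    # BAD: the file object is never explicitly closed, so the handle may
    # still be open when the test finishes -- --open-files would flag this.
    return open(path).readline()

def read_first_line(path):
    # GOOD: the context manager guarantees the handle is closed on exit,
    # even if an exception is raised while reading.
    with open(path) as fh:
        return fh.readline()

# Demonstration with a throwaway temporary file.
fd, path = tempfile.mkstemp()
os.write(fd, b'first line\nsecond line\n')
os.close(fd)
print(read_first_line(path))  # prints the first line
os.remove(path)
```
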
Also see :ref:`open-files`.
