The writing and reporting of assertions in tests
==================================================

.. _`assertfeedback`:
.. _`assert with the assert statement`:
.. _`assert`:


Asserting with the ``assert`` statement
---------------------------------------------------------

``pytest`` allows you to use the standard Python ``assert`` for verifying
expectations and values in Python tests.  For example, you can write the
following::

    # content of test_assert1.py
    def f():
        return 3

    def test_function():
        assert f() == 4

to assert that your function returns a certain value. If this assertion fails
you will see the return value of the function call::

    $ pytest test_assert1.py
    =========================== test session starts ============================
    platform linux -- Python 3.x.y, pytest-3.x.y, py-1.x.y, pluggy-0.x.y
    rootdir: $REGENDOC_TMPDIR, inifile:
    collected 1 item

    test_assert1.py F                                                    [100%]

    ================================= FAILURES =================================
    ______________________________ test_function _______________________________

        def test_function():
    >       assert f() == 4
    E       assert 3 == 4
    E        +  where 3 = f()

    test_assert1.py:5: AssertionError
    ========================= 1 failed in 0.12 seconds =========================

``pytest`` has support for showing the values of the most common subexpressions,
including calls, attributes, comparisons, and binary and unary operators
(see :ref:`tbreportdemo`).  This allows you to use idiomatic Python constructs
without boilerplate code while not losing introspection information.
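
As an illustration, here is a minimal sketch of a test whose assertion mixes
attribute access with a comparison; the ``Account`` class and its ``balance``
attribute are made up for this example::

    # content of test_attr.py (illustrative only)
    class Account(object):
        def __init__(self, balance):
            self.balance = balance

    def test_balance():
        account = Account(balance=3)
        # on failure pytest also shows the intermediate value of ``account.balance``
        assert account.balance >= 10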

However, if you specify a message with the assertion like this::

    assert a % 2 == 0, "value was odd, should be even"

then no assertion introspection takes place at all and the message
is simply shown in the traceback.

See :ref:`assert-details` for more information on assertion introspection.

.. _`assertraises`:

Assertions about expected exceptions
------------------------------------------

In order to write assertions about raised exceptions, you can use
``pytest.raises`` as a context manager like this::

    import pytest

    def test_zero_division():
        with pytest.raises(ZeroDivisionError):
            1 / 0

and if you need to have access to the actual exception info you may use::

    def test_recursion_depth():
        with pytest.raises(RuntimeError) as excinfo:
            def f():
                f()
            f()
        assert 'maximum recursion' in str(excinfo.value)

``excinfo`` is an ``ExceptionInfo`` instance, which is a wrapper around
the actual exception raised.  The main attributes of interest are
``.type``, ``.value`` and ``.traceback``.
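
For example, the attributes allow assertions that go beyond the exception
type; here is a minimal sketch built on the ``ZeroDivisionError`` example
from above::

    import pytest

    def test_zero_division_details():
        with pytest.raises(ZeroDivisionError) as excinfo:
            1 / 0
        # ``.type`` holds the exception class, ``.value`` the raised instance
        assert excinfo.type is ZeroDivisionError
        assert 'division' in str(excinfo.value)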

.. versionchanged:: 3.0

In the context manager form you may use the keyword argument
``message`` to specify a custom failure message::

     >>> with pytest.raises(ZeroDivisionError, message="Expecting ZeroDivisionError"):
     ...     pass
     Failed: Expecting ZeroDivisionError

If you want to write test code that works on Python 2.4 as well,
you may also use two other ways to test for an expected exception::

    pytest.raises(ExpectedException, func, *args, **kwargs)
    pytest.raises(ExpectedException, "func(*args, **kwargs)")

both of which execute the specified function with args and kwargs and
assert that the given ``ExpectedException`` is raised.  The reporter will
provide you with helpful output in case of failures such as *no
exception* or *wrong exception*.
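
As a sketch of the first form, any callable and its arguments can be passed
directly; ``operator.truediv`` is used here purely for illustration::

    import operator

    import pytest

    def test_zero_division_funcargs():
        # equivalent to wrapping ``operator.truediv(1, 0)`` in the context manager form
        pytest.raises(ZeroDivisionError, operator.truediv, 1, 0)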

Note that it is also possible to specify a "raises" argument to
``pytest.mark.xfail``, which checks that the test is failing in a more
specific way than just having any exception raised::

    @pytest.mark.xfail(raises=IndexError)
    def test_f():
        f()

Using ``pytest.raises`` is likely to be better for cases where you are testing
exceptions your own code is deliberately raising, whereas using
``@pytest.mark.xfail`` with a check function is probably better for something
like documenting unfixed bugs (where the test describes what "should" happen)
or bugs in dependencies.

Also, the context manager form accepts a ``match`` keyword parameter to test
that a regular expression matches on the string representation of an exception
(like the ``TestCase.assertRaisesRegexp`` method from ``unittest``)::

    import pytest

    def myfunc():
        raise ValueError("Exception 123 raised")

    def test_match():
        with pytest.raises(ValueError, match=r'.* 123 .*'):
            myfunc()

The ``match`` argument is matched against the string representation of the
exception with the ``re.search`` function, so in the above example
``match='123'`` would have worked as well.
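
If the text to match contains regular-expression metacharacters, one option is
to escape it first; a minimal sketch (the message text is made up)::

    import re

    import pytest

    def test_match_literal():
        msg = "invalid value (expected int)"
        # ``re.escape`` turns the message into a literal pattern for ``re.search``
        with pytest.raises(ValueError, match=re.escape(msg)):
            raise ValueError(msg)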


.. _`assertwarns`:

Assertions about expected warnings
-----------------------------------------

.. versionadded:: 2.8

You can check that code raises a particular warning using
:ref:`pytest.warns <warns>`.
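
A minimal sketch, using the context manager form of ``pytest.warns``::

    import warnings

    import pytest

    def test_user_warning():
        with pytest.warns(UserWarning):
            warnings.warn("this is a warning", UserWarning)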


.. _newreport:

Making use of context-sensitive comparisons
-------------------------------------------------

.. versionadded:: 2.0

``pytest`` has rich support for providing context-sensitive information
when it encounters comparisons.  For example::

    # content of test_assert2.py

    def test_set_comparison():
        set1 = set("1308")
        set2 = set("8035")
        assert set1 == set2

if you run this module::

    $ pytest test_assert2.py
    =========================== test session starts ============================
    platform linux -- Python 3.x.y, pytest-3.x.y, py-1.x.y, pluggy-0.x.y
    rootdir: $REGENDOC_TMPDIR, inifile:
    collected 1 item

    test_assert2.py F                                                    [100%]

    ================================= FAILURES =================================
    ___________________________ test_set_comparison ____________________________

        def test_set_comparison():
            set1 = set("1308")
            set2 = set("8035")
    >       assert set1 == set2
    E       AssertionError: assert {'0', '1', '3', '8'} == {'0', '3', '5', '8'}
    E         Extra items in the left set:
    E         '1'
    E         Extra items in the right set:
    E         '5'
    E         Use -v to get the full diff

    test_assert2.py:5: AssertionError
    ========================= 1 failed in 0.12 seconds =========================

Special comparisons are done for a number of cases:

* comparing long strings: a context diff is shown
* comparing long sequences: first failing indices
* comparing dicts: different entries (see the sketch below)
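
For instance, here is a sketch for the dict case; the values are made up, and
on failure only the differing entry is expected to show up in the report::

    def test_dict_comparison():
        expected = {"name": "alice", "age": 30}
        actual = {"name": "alice", "age": 31}
        # only the differing "age" entry should appear in the failure output
        assert actual == expected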

See the :ref:`reporting demo <tbreportdemo>` for many more examples.

Defining your own assertion comparison
----------------------------------------------

It is possible to add your own detailed explanations by implementing
the ``pytest_assertrepr_compare`` hook.

.. autofunction:: _pytest.hookspec.pytest_assertrepr_compare
   :noindex:

As an example consider adding the following hook in a :ref:`conftest.py <conftest.py>`
file which provides an alternative explanation for ``Foo`` objects::

   # content of conftest.py
   from test_foocompare import Foo
   def pytest_assertrepr_compare(op, left, right):
       if isinstance(left, Foo) and isinstance(right, Foo) and op == "==":
           return ['Comparing Foo instances:',
                   '   vals: %s != %s' % (left.val, right.val)]

now, given this test module::

   # content of test_foocompare.py
   class Foo(object):
       def __init__(self, val):
           self.val = val

       def __eq__(self, other):
           return self.val == other.val

   def test_compare():
       f1 = Foo(1)
       f2 = Foo(2)
       assert f1 == f2

you can run the test module and get the custom output defined in
the conftest file::

   $ pytest -q test_foocompare.py
   F                                                                    [100%]
   ================================= FAILURES =================================
   _______________________________ test_compare _______________________________

       def test_compare():
           f1 = Foo(1)
           f2 = Foo(2)
   >       assert f1 == f2
   E       assert Comparing Foo instances:
   E            vals: 1 != 2

   test_foocompare.py:11: AssertionError
   1 failed in 0.12 seconds

.. _assert-details:
.. _`assert introspection`:

Advanced assertion introspection
----------------------------------

.. versionadded:: 2.1


Reporting details about a failing assertion is achieved by rewriting assert
statements before they are run.  Rewritten assert statements put introspection
information into the assertion failure message.  ``pytest`` only rewrites test
modules directly discovered by its test collection process, so asserts in
supporting modules which are not themselves test modules will not be rewritten.
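
If asserts in a supporting module should be rewritten as well, one option
(available since pytest 3.0) is to register that module for rewriting,
typically from a ``conftest.py``; the module name below is hypothetical::

    # content of conftest.py
    import pytest

    # "helpers" is a hypothetical support module containing assertion helpers;
    # it must be registered before it is imported anywhere.
    pytest.register_assert_rewrite("helpers")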

.. note::

   ``pytest`` rewrites test modules on import by using an import
   hook to write new ``pyc`` files. Most of the time this works transparently.
   However, if you are messing with import yourself, the import hook may
   interfere.

   If this is the case you have two options:

   * Disable rewriting for a specific module by adding the string
     ``PYTEST_DONT_REWRITE`` to its docstring (see the sketch below).

   * Disable rewriting for all modules by using ``--assert=plain``.
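
   For example, the docstring form looks like this; ``helpers.py`` is a
   hypothetical support module used only for illustration::

      # content of helpers.py (hypothetical module name)
      """Assertion helpers for the test suite.

      PYTEST_DONT_REWRITE
      """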

   Additionally, rewriting will fail silently if it cannot write new ``.pyc`` files,
   e.g. in a read-only filesystem or a zipfile.


For further information, Benjamin Peterson wrote up `Behind the scenes of pytest's new assertion rewriting <http://pybites.blogspot.com/2011/07/behind-scenes-of-pytests-new-assertion.html>`_.

.. versionadded:: 2.1
   Add assert rewriting as an alternate introspection technique.

.. versionchanged:: 2.1
   Introduce the ``--assert`` option. Deprecate ``--no-assert`` and
   ``--nomagic``.

.. versionchanged:: 3.0
   Remove the ``--no-assert`` and ``--nomagic`` options.
   Remove the ``--assert=reinterp`` option.