Working with non-python tests¶
A basic example for specifying tests in YAML files¶
Here is an example conftest.py (extracted from Ali Afshar's special-purpose pytest-yamlwsgi plugin). This conftest.py will collect test*.yml files and execute their YAML-formatted content as custom tests:
# content of conftest.py
import pytest


def pytest_collect_file(parent, path):
    if path.ext == ".yml" and path.basename.startswith("test"):
        return YamlFile(path, parent)


class YamlFile(pytest.File):
    def collect(self):
        import yaml  # we need a yaml parser, e.g. PyYAML

        raw = yaml.safe_load(self.fspath.open())
        for name, spec in sorted(raw.items()):
            yield YamlItem(name, self, spec)


class YamlItem(pytest.Item):
    def __init__(self, name, parent, spec):
        super(YamlItem, self).__init__(name, parent)
        self.spec = spec

    def runtest(self):
        for name, value in sorted(self.spec.items()):
            # some custom test execution (dumb example follows)
            if name != value:
                raise YamlException(self, name, value)

    def repr_failure(self, excinfo):
        """Called when self.runtest() raises an exception."""
        if isinstance(excinfo.value, YamlException):
            return "\n".join(
                [
                    "usecase execution failed",
                    "   spec failed: %r: %r" % excinfo.value.args[1:3],
                    "   no further details known at this point.",
                ]
            )

    def reportinfo(self):
        return self.fspath, 0, "usecase: %s" % self.name


class YamlException(Exception):
    """Custom exception for error reporting."""
You can create a simple example file:
# test_simple.yml
ok:
    sub1: sub1
hello:
    world: world
    some: other
and if you have PyYAML or a compatible YAML parser installed (for example via pip install PyYAML), you can now execute the test specification:
nonpython $ pytest test_simple.yml
=========================== test session starts ============================
platform linux -- Python 3.x.y, pytest-3.x.y, py-1.x.y, pluggy-0.x.y
rootdir: $REGENDOC_TMPDIR/nonpython, inifile:
collected 2 items
test_simple.yml F. [100%]
================================= FAILURES =================================
______________________________ usecase: hello ______________________________
usecase execution failed
spec failed: 'some': 'other'
no further details known at this point.
==================== 1 failed, 1 passed in 0.12 seconds ====================
You get one dot for the passing sub1: sub1
check and one failure.
Obviously, in the above conftest.py you'll want to implement a more interesting interpretation of the YAML values. You can easily write your own domain-specific testing language this way.
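As a minimal sketch of one such interpretation (purely hypothetical, not part of the original plugin), each YAML entry could pair a Python expression with its expected result, and YamlItem.runtest could evaluate and compare them:

def runtest(self):
    # hypothetical variant of YamlItem.runtest; the "expression: expected"
    # layout is an assumption for illustration, and YamlException is the
    # class defined in the conftest.py above.
    #
    # A spec such as
    #
    #   arithmetic:
    #       "1 + 1": 2
    #       "2 * 3": 6
    #
    # maps expressions to their expected values.
    for expression, expected in sorted(self.spec.items()):
        result = eval(expression)  # toy example only; never eval untrusted input
        if result != expected:
            raise YamlException(self, expression, expected)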
Note
repr_failure(excinfo) is called for representing test failures. If you create custom collection nodes you can return an error representation string of your choice. It will be reported as a (red) string.
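If runtest can also raise exceptions other than your custom one, a common refinement (an assumption here, not shown in the example above) is to fall back to pytest's default failure representation for everything else:

def repr_failure(self, excinfo):
    """Called when self.runtest() raises an exception."""
    if isinstance(excinfo.value, YamlException):
        return "usecase execution failed: %r: %r" % excinfo.value.args[1:3]
    # delegate anything unexpected to pytest's default representation
    return super(YamlItem, self).repr_failure(excinfo)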
reportinfo() is used for representing the test location and is also consulted when reporting in verbose mode:
nonpython $ pytest -v
=========================== test session starts ============================
platform linux -- Python 3.x.y, pytest-3.x.y, py-1.x.y, pluggy-0.x.y -- $PYTHON_PREFIX/bin/python3.6
cachedir: .pytest_cache
rootdir: $REGENDOC_TMPDIR/nonpython, inifile:
collecting ... collected 2 items
test_simple.yml::hello FAILED [ 50%]
test_simple.yml::ok PASSED [100%]
================================= FAILURES =================================
______________________________ usecase: hello ______________________________
usecase execution failed
spec failed: 'some': 'other'
no further details known at this point.
==================== 1 failed, 1 passed in 0.12 seconds ====================
While developing your custom test collection and execution, it is also interesting to just look at the collection tree:
nonpython $ pytest --collect-only
=========================== test session starts ============================
platform linux -- Python 3.x.y, pytest-3.x.y, py-1.x.y, pluggy-0.x.y
rootdir: $REGENDOC_TMPDIR/nonpython, inifile:
collected 2 items
<Package '$REGENDOC_TMPDIR/nonpython'>
  <YamlFile 'test_simple.yml'>
    <YamlItem 'hello'>
    <YamlItem 'ok'>
======================= no tests ran in 0.12 seconds =======================