pytest [File or Directory]

Run tests with pytest, the Python testing framework. Discovers and executes tests, reporting results to the terminal

Arguments

File or Directory
  The test file or directory to run pytest on. If omitted, pytest runs all files of the form test_*.py or *_test.py in the current directory and its subdirectories
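As a sketch of the default discovery rules (the demo/ directory and file names below are hypothetical, for illustration only):

```shell
# Hypothetical layout: one test file matching the test_*.py pattern.
mkdir -p demo/tests
cat > demo/tests/test_sample.py <<'EOF'
def test_addition():
    assert 1 + 1 == 2
EOF

cd demo
pytest -q                        # no argument: discovers tests/test_sample.py
pytest -q tests                  # restrict collection to one directory
pytest -q tests/test_sample.py   # run a single file
cd ..
```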

Options

--assert <Mode>
  Control assertion debugging tools. 'plain' performs no assertion debugging. 'rewrite' (the default) rewrites assert statements in test modules on import to provide assert expression information
--basetemp <Directory>
  Base temporary directory for this test run. (Warning: this directory is removed if it exists)
-c <File>
  Load configuration from `file` instead of trying to locate one of the implicit configuration files
--cache-clear
  Remove all cache contents at the start of the test run
--cache-show [Glob]
  Show cache contents; don't perform collection or run tests. Optional argument: glob (default: '*')
--capture <Method>
  Per-test capturing method
--code-highlight <Highlight>
  Whether code should be highlighted (only if --color is also enabled)
--co, --collect-only
  Only collect tests, don't execute them
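A quick sketch of --collect-only (the file name is hypothetical): in quiet mode it prints one node ID per line for the tests pytest would run, without executing any of them.

```shell
cat > test_collect_demo.py <<'EOF'
def test_first():
    assert True

def test_second():
    assert True
EOF

# Prints node IDs such as test_collect_demo.py::test_first
pytest -q --collect-only test_collect_demo.py
```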
--collect-in-virtualenv
  Don't ignore tests in a local virtualenv directory
--color <Color>
  Color terminal output
--confcutdir <Dir>
  Only load conftest.py files relative to the specified directory
--continue-on-collection-errors
  Force test execution even if collection errors occur
--debug [Debug File Name]
  Store internal tracing debug information in this log file. The file is opened with 'w' and truncated, so take care. Defaults to 'pytestdebug.log'
--durations <N>
  Show the N slowest setup/test durations (N=0 for all)
--durations-min <N>
  Minimal duration in seconds for inclusion in the slowest list
--deselect <nodeid_prefix>
  Deselect an item (via node ID prefix) during collection
  • Repeatable ♾
--disable-warnings, --disable-pytest-warnings
  Disable the warnings summary
--doctest-continue-on-failure
  For a given doctest, continue running after the first failure
--doctest-ignore-import-errors
  Ignore doctest ImportErrors
--doctest-modules
  Run doctests in all .py modules
--doctest-report <Output format>
  Choose another output format for diffs on doctest failure
--doctest-glob <Pattern>
  Run doctests on text files matching the pattern (default: test*.txt)
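A minimal sketch of --doctest-modules (the module name is hypothetical): pytest collects the interactive examples in docstrings and runs them as tests.

```shell
# Hypothetical module with a doctest in its docstring.
cat > mathdemo.py <<'EOF'
def square(x):
    """Return x squared.

    >>> square(3)
    9
    """
    return x * x
EOF

pytest -q --doctest-modules mathdemo.py   # collects and runs the doctest
```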
--exitfirst, -x
  Exit instantly on the first error or failed test
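To illustrate -x (the file name below is hypothetical): the run stops at the first failure, so later tests in the file are never executed.

```shell
# Hypothetical file where the first test fails.
cat > test_failfast_demo.py <<'EOF'
def test_breaks():
    assert False

def test_never_reached():
    assert True
EOF

# With -x, the run stops at test_breaks; test_never_reached is not run.
pytest -q -x test_failfast_demo.py || true
```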
--failed-first, --ff
  Run all tests, but run the last failures first
--fixtures
  Show builtin and custom fixtures. Note that this omits fixtures with a leading _ unless -v is also given
--fixtures-per-test
  Show fixtures per test
--full-trace
  Don't cut any tracebacks (the default is to cut)
--help, -h
  Show help on command-line and config-file options
--ignore <Path>
  Ignore the path during collection
  • Repeatable ♾
--ignore-glob <Path>
  Ignore paths matching the glob pattern during collection
  • Repeatable ♾
--import-mode <Mode>
  Prepend/append to sys.path when importing test modules and conftest files; the default is to prepend
--junit-xml <Path>
  Create a junit-xml style report file at the given path
--junit-prefix <Str>
  Prepend a prefix to classnames in junit-xml output
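A sketch of --junit-xml (file names hypothetical): the normal terminal output is unchanged, and a JUnit-style XML report is written to the given path for CI systems to consume.

```shell
cat > test_report_demo.py <<'EOF'
def test_ok():
    assert True
EOF

# Write a JUnit-style XML report alongside the terminal output.
pytest -q --junit-xml=report.xml test_report_demo.py
```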
-k <Expression>
  Only run tests which match the given substring expression. An expression is a Python-evaluatable expression where all names are substring-matched against test names and their parent classes. Example: -k 'test_method or test_other' matches all test functions and classes whose names contain 'test_method' or 'test_other', while -k 'not test_method' matches those that don't. -k 'not test_method and not test_other' deselects both. Additionally, keywords are matched against classes and functions containing extra names in their 'extra_keyword_matches' set, as well as functions which have names assigned directly to them. Matching is case-insensitive
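The expression forms above can be sketched as follows (test names hypothetical):

```shell
# Two hypothetical tests to select between.
cat > test_select_demo.py <<'EOF'
def test_alpha():
    assert True

def test_beta():
    assert True
EOF

pytest -q -k "alpha" test_select_demo.py            # runs only test_alpha
pytest -q -k "not alpha" test_select_demo.py        # runs only test_beta
pytest -q -k "alpha or beta" test_select_demo.py    # runs both
```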
--keep-duplicates
  Keep duplicate tests
--showlocals, -l
  Show locals in tracebacks (disabled by default)
--last-failed-no-failures, --lfnf <Tests>
  Which tests to run when there are no previously (known) failures
--last-failed, --lf
  Rerun only the tests that failed at the last run (or all if none failed)
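A sketch of the --lf workflow (file and marker names hypothetical): the first run records the failure in pytest's cache, and the second run with --lf reruns only that failed test.

```shell
# A hypothetical test whose outcome depends on a marker file.
cat > test_lf_demo.py <<'EOF'
import os

def test_flaky():
    assert os.path.exists("fixed.marker")

def test_stable():
    assert True
EOF

pytest -q test_lf_demo.py || true   # test_flaky fails; the failure is cached
touch fixed.marker
pytest -q --lf test_lf_demo.py      # reruns only test_flaky, which now passes
```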
--log-auto-indent <Log Auto Indent Setting>
  Auto-indent multiline messages passed to the logging module. Accepts true|on, false|off, or an integer
--log-cli-level <Log CLI Level>
  CLI logging level
--log-cli-format <Log CLI Format>
  Log format as used by the logging module
--log-cli-date-format <Log CLI Date Format>
  Log date format as used by the logging module
--log-date-format <Log Date Format>
  Log date format as used by the logging module
--log-format <Log Format>
  Log format as used by the logging module
--log-file <Log File Path>
  Path to a file where logging will be written
--log-file-level <Log File Level>
  Log file logging level
--log-file-date-format <Log File Date Format>
  Log date format as used by the logging module
--log-file-format <Log File Format>
  Log format as used by the logging module
--log-level <Level>
  Level of messages to catch/display. Not set by default, so it depends on the root/parent log handler's effective level, which is `WARNING` by default
-m <Mark Expression>
  Only run tests matching the given mark expression
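A sketch of marker selection (file, marker, and test names hypothetical). Registering the marker in the configuration avoids unknown-marker warnings (and errors under --strict-markers):

```shell
# Hypothetical tests plus a minimal config registering the 'slow' marker.
cat > test_marks_demo.py <<'EOF'
import pytest

@pytest.mark.slow
def test_big():
    assert True

def test_quick():
    assert True
EOF

cat > pytest.ini <<'EOF'
[pytest]
markers =
    slow: long-running tests
EOF

pytest -q -m slow test_marks_demo.py         # runs only test_big
pytest -q -m "not slow" test_marks_demo.py   # runs only test_quick
```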
--markers
  Show markers (builtin, plugin, and per-project ones)
--maxfail <num>
  Exit after the first num failures or errors
--new-first, --nf
  Run tests from new files first, then the rest of the tests sorted by file mtime
--noconftest
  Don't load any conftest.py files
--no-header
  Disable the header
--no-summary
  Disable the summary
--override-ini, -o <Override INI>
  Override an ini option with `option=value` style, e.g. `-o xfail_strict=True`
-p <Plugin name>
  Early-load the given plugin module name or entry point. To avoid loading plugins, use the `no:` prefix, e.g. `no:doctest`
  • Repeatable ♾
--pastebin <mode>
  Send failed|all info to the bpaste.net pastebin service
--pdb
  Start the interactive Python debugger on errors or KeyboardInterrupt
--pdbcls <modulename:classname>
  Specify a custom interactive Python debugger for use with --pdb
--pyargs
  Try to interpret all arguments as Python packages
--quiet, -q
  Decrease verbosity
-r <chars>
  Show extra test summary info as specified by chars: (f)ailed, (E)rror, (s)kipped, (x)failed, (X)passed, (p)assed, (P)assed with output, (a)ll except passed (p/P), or (A)ll. (w)arnings are enabled by default (see --disable-warnings); 'N' can be used to reset the list. (Default: 'fE')
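For example, -rs surfaces skipped tests and their reasons in the short summary (the file and reason below are hypothetical):

```shell
# A hypothetical skipped test to surface in the summary.
cat > test_skip_demo.py <<'EOF'
import pytest

@pytest.mark.skip(reason="demo only")
def test_skipped():
    assert True
EOF

# -rs adds a SKIPPED line with the reason to the short test summary.
pytest -q -rs test_skip_demo.py
```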
--rootdir <Root Dir>
  Define the root directory for tests. Can be a relative path: 'root_dir', './root_dir', 'root_dir/another_dir/'; an absolute path: '/home/user/root_dir'; or a path with variables: '$HOME/root_dir'
--runxfail
  Report the results of xfail tests as if they were not marked
-s
  Shortcut for --capture=no
--setup-only
  Only set up fixtures, do not execute tests
--setup-show
  Show the setup of fixtures while executing tests
--setup-plan
  Show which fixtures and tests would be executed, but don't execute anything
--show-capture <Capture method>
  Controls how captured stdout/stderr/log is shown on failed tests
--stepwise, --sw
  Exit on test failure and continue from the last failing test next time
--stepwise-skip, --sw-skip
  Ignore the first failing test but stop on the next failing test
--strict
  Alias for --strict-markers
--strict-config
  Any warnings encountered while parsing the `pytest` section of the configuration file raise errors
--strict-markers
  Markers not registered in the `markers` section of the configuration file raise errors
--tb <Traceback print mode>
  Traceback print mode
--trace
  Immediately break when running each test
--trace-config
  Trace considerations of conftest.py files
--verbose, -v
  Increase verbosity
  • Repeatable ♾
--verbosity <Verbosity level>
  Set verbosity. Default is 0
--version, -V
  Display the pytest version. When given twice, also display information about plugins
--pythonwarnings, -W <Warnings to report>
  Set which warnings to report; see the -W option of Python itself
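As a sketch of -W (file name hypothetical), the filter `error::DeprecationWarning` escalates that warning category to an error, turning a warning-emitting test into a failure:

```shell
# A hypothetical test that emits a DeprecationWarning.
cat > test_warn_demo.py <<'EOF'
import warnings

def test_emits_warning():
    warnings.warn("old API", DeprecationWarning)
EOF

# Escalate DeprecationWarning to an error: the test now fails.
pytest -q -W error::DeprecationWarning test_warn_demo.py || true
```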