Chapter 2: pytest Essentials
Introduction to pytest Power Features
In Chapter 1, we learned unit testing fundamentals and basic pytest assertions. Now we'll explore pytest's advanced features that make it the most popular Python testing framework. These features—fixtures, parametrization, and markers—eliminate code duplication, enable data-driven testing, and organize test suites at scale.
Think of basic pytest as a calculator and advanced pytest as a programmable computer. The fundamentals work fine for simple cases, but complex test suites need sophisticated tools. A test suite with hundreds or thousands of tests requires organization, reusable components, and efficient ways to handle test data.
By mastering fixtures, you'll write DRY (Don't Repeat Yourself) tests with shared setup code. Parametrization lets you test the same logic with dozens of inputs without duplicating test code. Markers help you organize tests into categories and run subsets selectively. Together, these features transform testing from tedious repetition into an elegant, maintainable practice.
Fixtures: Setup and Teardown
Fixtures are pytest's dependency injection system—arguably its most powerful feature. They provide reusable test data, objects, or setup code. Instead of copying setup code into every test, you define it once as a fixture and pytest automatically provides it to tests that request it.
Without fixtures, the same setup code gets copied into every test that needs it; with fixtures, that setup is defined once and reused wherever it's requested. Fixtures eliminate duplication and ensure consistent setup/teardown across tests.
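A side-by-side sketch of the difference (the `make_account` helper and test names here are hypothetical):

```python
import pytest

def make_account():
    """Shared setup that would otherwise be copied into every test."""
    return {"owner": "Alice", "balance": 100}

# Without a fixture, each test repeats the setup call itself:
def test_owner_without_fixture():
    account = make_account()
    assert account["owner"] == "Alice"

# With a fixture, pytest prepares the object and injects it:
@pytest.fixture
def account():
    return make_account()

def test_balance_with_fixture(account):
    assert account["balance"] == 100
```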
Basic Fixtures
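A minimal example (the fixture and test names are illustrative):

```python
import pytest

@pytest.fixture
def sample_user():
    """Provide a fresh user dict to any test that requests it."""
    return {"name": "Alice", "email": "alice@example.com"}

def test_user_has_email(sample_user):
    # pytest sees the `sample_user` argument, calls the fixture,
    # and passes its return value in automatically
    assert sample_user["email"] == "alice@example.com"
```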
Pytest automatically calls the fixture function and passes its return value to tests that request it.
Fixtures with Setup and Teardown
The yield statement splits setup from teardown. Code before yield runs before the test, code after runs after—even if the test fails.
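A sketch of the pattern (the generator is factored into a plain helper, `open_resource`, purely so the setup/teardown order is easy to see; all names are illustrative):

```python
import pytest

def open_resource():
    """Plain generator: set up, yield the resource, then tear down."""
    resource = {"open": True}   # setup: runs before the test
    yield resource
    resource["open"] = False    # teardown: runs after the test

@pytest.fixture
def managed_resource():
    # pytest drives this generator: it advances to the yield before the
    # test and resumes it (running teardown) afterward, even on failure
    yield from open_resource()

def test_resource_is_open(managed_resource):
    assert managed_resource["open"]
```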
Fixture Scopes and Dependency Injection
Fixture scope determines how often pytest creates the fixture. Choosing the right scope balances test isolation against performance.
Fixture Scopes Explained
function (default): Creates a new fixture instance before each test function. Ensures complete test isolation but can be slow for expensive setup.
class: Creates one instance shared by all methods in a test class. Methods in the class can affect shared state.
module: Creates one instance shared by all tests in the module (Python file). Faster for expensive setup but tests can interfere with each other if they modify the fixture.
session: Creates one instance for the entire pytest run. Perfect for very expensive resources like starting a database server, but requires careful handling of test isolation.
Example of different scopes:
```python
@pytest.fixture(scope="function")  # Default
def clean_database():
    """Fresh database for each test."""
    db = Database(":memory:")
    yield db
    db.close()

@pytest.fixture(scope="module")
def expensive_model():
    """Load ML model once per module."""
    return load_large_model()  # Takes 5 seconds

@pytest.fixture(scope="session")
def test_server():
    """Start server once for all tests."""
    server = TestServer()
    server.start()
    yield server
    server.stop()
```

Choosing the right scope:
- Use function when tests modify the fixture
- Use class for test classes sharing related state
- Use module for read-only expensive resources
- Use session for one-time setup like servers or large data loads
Dependency Injection in Practice
Fixtures can depend on other fixtures, creating a dependency graph pytest resolves automatically:
```python
@pytest.fixture(scope="module")
def expensive_resource():
    """Created once per module, shared across tests."""
    print("Creating expensive resource...")
    return {"data": "loaded"}

@pytest.fixture
def user(expensive_resource):
    """Fixtures can depend on other fixtures."""
    return {"name": "Alice", "resource": expensive_resource}

def test_user_has_resource(user):
    """Test receives both fixtures automatically."""
    assert "resource" in user
    assert user["resource"]["data"] == "loaded"
```

Pytest handles the dependency graph automatically—fixtures are created in the right order.
Parametrization: Data-Driven Tests
Parametrization is pytest's solution to testing the same logic with multiple inputs. It transforms one test function into many test cases, each with different parameters. This is essential for thorough testing without code duplication.
The Problem Without Parametrization:
```python
def test_square_of_2():
    assert square(2) == 4

def test_square_of_3():
    assert square(3) == 9

def test_square_of_4():
    assert square(4) == 16

# ... dozens more similar tests
```

This violates DRY principles and creates maintenance nightmares. Change the assertion logic? Update every test.
The Solution With Parametrization:
```python
@pytest.mark.parametrize("input,expected", [
    (2, 4), (3, 9), (4, 16), (5, 25), (10, 100),
    (0, 0), (-3, 9), (100, 10000)
])
def test_square(input, expected):
    assert square(input) == expected
```

One test function generates 8 test cases. Add more test data without touching test logic.
Basic Parametrization
```python
@pytest.mark.parametrize("input,expected", [
    (2, 4),
    (3, 9),
    (4, 16),
    (5, 25),
])
def test_square(input, expected):
    """Test runs 4 times with different values."""
    assert input ** 2 == expected
```

This creates 4 separate tests, each with different data.
Multiple Parameters
Parametrizing Multiple Arguments
```python
@pytest.mark.parametrize("x", [1, 2, 3])
@pytest.mark.parametrize("y", [10, 20])
def test_multiplication(x, y):
    """Generates 3 × 2 = 6 test cases."""
    result = x * y
    assert result > 0
```

Stacking decorators creates the Cartesian product of parameters.
Test Markers and Custom Markers
As test suites grow, you need to organize and categorize tests. Markers are pytest's tagging system—they let you label tests and run subsets selectively. Want to run just smoke tests before deployment? Mark them with @pytest.mark.smoke and run pytest -m smoke.
Markers solve several problems:
- Selective execution: Run only integration tests, skip slow tests, etc.
- Categorization: Organize tests by type (unit, integration, smoke, regression)
- Conditional skipping: Skip tests on certain platforms or Python versions
- Test metadata: Attach information about expected failures or known bugs
Built-in Markers
```python
import sys

import pytest

@pytest.mark.skip(reason="Feature not implemented")
def test_future_feature():
    pass

@pytest.mark.skipif(sys.version_info < (3, 10), reason="Requires Python 3.10+")
def test_new_syntax():
    pass

@pytest.mark.xfail(reason="Known bug, fix in progress")
def test_buggy_feature():
    assert False  # Expected to fail
```

Custom Markers
Define markers in pytest.ini:
```ini
[pytest]
markers =
    slow: marks tests as slow
    integration: integration tests
    smoke: smoke tests for quick validation
```

Use them in tests:
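For example (test names are illustrative; the markers match those registered in pytest.ini):

```python
import pytest

@pytest.mark.smoke
def test_homepage_loads():
    assert "home" in "homepage"

@pytest.mark.slow
@pytest.mark.integration
def test_full_data_sync():
    assert sum(range(1000)) == 499500
```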
Run selectively:
```bash
pytest -m smoke                        # Only smoke tests
pytest -m "not slow"                   # Skip slow tests
pytest -m "integration and not slow"   # Combination
```

conftest.py: Shared Fixtures
conftest.py is pytest's special configuration file where you define fixtures, hooks, and plugins available to all tests in a directory and its subdirectories. It's pytest's way of sharing code across test files without explicit imports.
The name "conftest" is reserved by pytest—it automatically discovers and loads these files before running tests. You can have multiple conftest.py files at different directory levels, creating a hierarchy of shared fixtures.
Why use conftest.py?
- Centralize common fixtures used across multiple test files
- Avoid circular imports between test modules
- Organize fixtures by scope (project-wide vs. module-specific)
- Configure pytest plugins and hooks
- Define custom markers and command-line options
Project structure:
```
tests/
├── conftest.py          # Fixtures for all tests
├── test_auth.py
└── api/
    ├── conftest.py      # Additional fixtures for api tests
    └── test_users.py
```

Fixtures in tests/conftest.py are available to all tests. Fixtures in tests/api/conftest.py are available only to tests in the api directory and subdirectories.
conftest.py:
```python
import pytest

@pytest.fixture
def db_connection():
    """Database fixture available to all tests."""
    conn = create_connection()
    yield conn
    conn.close()

@pytest.fixture
def auth_user():
    """Authenticated user fixture."""
    return {"id": 1, "name": "Test User", "role": "admin"}
```

test_auth.py:
```python
def test_user_login(db_connection, auth_user):
    """Tests automatically get conftest fixtures."""
    result = login(db_connection, auth_user["name"])
    assert result is True
```

No imports needed—pytest discovers conftest.py automatically.
Fixture Discovery Order
Pytest searches for fixtures in this order:
- Test file itself
- conftest.py in same directory
- conftest.py in parent directories
- Built-in pytest fixtures
Fixtures in inner scopes override outer scopes.
Built-in Fixtures
Pytest includes dozens of built-in fixtures providing common testing utilities. You don't need to define these—they're always available. Here are the most useful ones for everyday testing:
tmpdir and tmp_path - Temporary Directories
For tests that work with files, pytest provides temporary directories automatically cleaned up:
```python
def test_file_creation(tmpdir):
    """tmpdir provides a py.path.local temporary directory."""
    file = tmpdir.join("test.txt")
    file.write("content")
    assert file.read() == "content"
    # Deleted automatically after test

def test_with_pathlib(tmp_path):
    """tmp_path provides a pathlib.Path object (preferred)."""
    file = tmp_path / "data.json"
    file.write_text('{"key": "value"}')
    assert file.exists()
    # Deleted automatically after test
```

capsys - Capture Output
```python
def test_print_output(capsys):
    """Capture stdout/stderr."""
    print("Hello, World!")
    captured = capsys.readouterr()
    assert captured.out == "Hello, World!\n"
```

monkeypatch - Mock/Patch
```python
import os

def test_environment_variable(monkeypatch):
    """Temporarily modify environment."""
    monkeypatch.setenv("API_KEY", "test-key")
    assert os.getenv("API_KEY") == "test-key"
    # Original environment restored after test
```

Fixture Best Practices
- Keep fixtures focused: One responsibility per fixture
- Use meaningful names: authenticated_user, not fixture1
- Choose appropriate scope: Don't recreate expensive resources unnecessarily
- Minimize side effects: Each test should be independent
- Document fixtures: Explain what they provide and any setup required
Combining Features
Powerful tests combine fixtures, parametrization, and markers:
```python
@pytest.fixture
def api_client():
    return APIClient(base_url="http://test.api")

@pytest.mark.integration
@pytest.mark.parametrize("endpoint,expected_status", [
    ("/users", 200),
    ("/posts", 200),
    ("/invalid", 404),
])
def test_api_endpoints(api_client, endpoint, expected_status):
    """Parametrized integration test using fixture."""
    response = api_client.get(endpoint)
    assert response.status_code == expected_status
```

This creates 3 integration tests, each using the api_client fixture with different endpoints.
Course Recommendations
Master advanced pytest techniques with these courses on paiml.com:
pytest Mastery
- Advanced fixtures and dependency injection patterns
- Parametrization strategies for comprehensive testing
- Custom plugins and hooks
- Test organization at scale
- Enroll at paiml.com
Test Automation with Python
- Building robust test suites
- CI/CD integration with pytest
- Parallel test execution with pytest-xdist
- Coverage analysis and reporting
- Enroll at paiml.com
Python Testing Patterns
- Fixture design patterns
- Mocking and patching strategies
- Testing async code
- Database testing with fixtures
- Enroll at paiml.com
Quiz
📝 Test Your Knowledge: pytest Essentials
Take this quiz to reinforce what you've learned in this chapter.