Reporting

The reporting module provides HTML report generation and metrics collection for route smoke tests.

Metrics Classes

RouteMetrics

Collects metrics for a single route during testing.

class pytest_routes.reporting.RouteMetrics[source]

Bases: object

Performance and test metrics for a single route.

Variables:
  • route_path – The route path.

  • method – HTTP method.

  • total_requests – Total number of requests made.

  • successful_requests – Number of successful requests (2xx-4xx).

  • failed_requests – Number of failed requests (5xx or unexpected).

  • total_time_ms – Total time spent testing this route in milliseconds.

  • min_time_ms – Minimum request time in milliseconds.

  • max_time_ms – Maximum request time in milliseconds.

  • avg_time_ms – Average request time in milliseconds.

  • status_codes – Distribution of status codes.

  • errors – List of error messages encountered.

Parameters:
route_path: str
method: str
total_requests: int = 0
successful_requests: int = 0
failed_requests: int = 0
total_time_ms: float = 0.0
min_time_ms: float = inf
max_time_ms: float = 0.0
status_codes: dict[int, int]
errors: list[str]
property avg_time_ms: float

Calculate average request time.

property success_rate: float

Calculate success rate as a percentage.

property passed: bool

Check if all requests were successful.

record_request(status_code, elapsed_ms, *, success=True, error=None)[source]

Record metrics for a single request.

Parameters:
  • status_code (int) – HTTP status code returned.

  • elapsed_ms (float) – Time taken for the request in milliseconds.

  • success (bool) – Whether the request was successful.

  • error (str | None) – Error message if the request failed.

Return type:

None

to_dict()[source]

Convert metrics to dictionary for serialization.

Return type:

dict[str, Any]

__init__(route_path, method, total_requests=0, successful_requests=0, failed_requests=0, total_time_ms=0.0, min_time_ms=inf, max_time_ms=0.0, status_codes=<factory>, errors=<factory>)

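For orientation, the bookkeeping that record_request performs on these fields can be sketched with a self-contained stand-in. This is an illustration built only from the documented field descriptions; RouteMetricsSketch is a hypothetical name, not the library's actual implementation:

```python
from __future__ import annotations

from dataclasses import dataclass, field


@dataclass
class RouteMetricsSketch:
    """Hypothetical stand-in mirroring the documented RouteMetrics fields."""

    route_path: str
    method: str
    total_requests: int = 0
    successful_requests: int = 0
    failed_requests: int = 0
    total_time_ms: float = 0.0
    min_time_ms: float = float("inf")
    max_time_ms: float = 0.0
    status_codes: dict[int, int] = field(default_factory=dict)
    errors: list[str] = field(default_factory=list)

    def record_request(
        self,
        status_code: int,
        elapsed_ms: float,
        *,
        success: bool = True,
        error: str | None = None,
    ) -> None:
        # Update counters, timing extrema, and the status-code distribution.
        self.total_requests += 1
        if success:
            self.successful_requests += 1
        else:
            self.failed_requests += 1
            if error is not None:
                self.errors.append(error)
        self.total_time_ms += elapsed_ms
        self.min_time_ms = min(self.min_time_ms, elapsed_ms)
        self.max_time_ms = max(self.max_time_ms, elapsed_ms)
        self.status_codes[status_code] = self.status_codes.get(status_code, 0) + 1

    @property
    def avg_time_ms(self) -> float:
        # Zero requests yields 0.0 rather than a division error.
        return self.total_time_ms / self.total_requests if self.total_requests else 0.0
```

Note that min_time_ms starts at infinity so that the first recorded request always becomes the minimum.
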
RunMetrics

Aggregates metrics across all routes for a test run.

class pytest_routes.reporting.RunMetrics[source]

Bases: object

Aggregate test metrics across all routes.

Variables:
  • start_time – When testing started (Unix timestamp).

  • end_time – When testing ended (Unix timestamp).

  • routes – Metrics for individual routes.

  • total_routes – Total number of routes tested.

  • passed_routes – Number of routes that passed.

  • failed_routes – Number of routes that failed.

  • skipped_routes – Number of routes that were skipped.

Parameters:
start_time: float
end_time: float | None = None
routes: dict[str, RouteMetrics]
skipped_routes: int = 0
property total_routes: int

Total number of routes tested.

property passed_routes: int

Number of routes that passed.

property failed_routes: int

Number of routes that failed.

property total_requests: int

Total requests across all routes.

property total_time_ms: float

Total time spent testing in milliseconds.

property duration_seconds: float

Total test duration in seconds.

property pass_rate: float

Percentage of routes that passed.

get_or_create_route_metrics(route)[source]

Get or create metrics for a route.

Parameters:

route (RouteInfo) – The route to get metrics for.

Return type:

RouteMetrics

Returns:

RouteMetrics instance for the route.

finish()[source]

Mark testing as finished.

Return type:

None

to_dict()[source]

Convert metrics to dictionary for serialization.

Return type:

dict[str, Any]

__init__(start_time=<factory>, end_time=None, routes=<factory>, skipped_routes=0)

Metrics Functions

pytest_routes.reporting.aggregate_metrics(route_metrics)[source]

Aggregate individual route metrics into test metrics.

Parameters:

route_metrics (list[RouteMetrics]) – List of RouteMetrics to aggregate.

Return type:

RunMetrics

Returns:

Aggregated RunMetrics.
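As an illustration of what aggregation amounts to, the roll-up from per-route numbers to run-level totals can be sketched over plain dicts. This is a simplified stand-in, not the library's implementation; the dict keys are hypothetical reductions of RouteMetrics fields:

```python
# Hypothetical reduced form of per-route metrics (plain dicts stand in
# for RouteMetrics objects).
per_route = [
    {"key": "GET /users", "requests": 3, "time_ms": 42.0, "passed": True},
    {"key": "POST /users", "requests": 2, "time_ms": 30.0, "passed": False},
]


def aggregate(metrics: list) -> dict:
    """Roll per-route numbers into run-level totals, mirroring the
    RunMetrics properties total_routes, passed_routes, failed_routes,
    total_requests, and total_time_ms."""
    return {
        "total_routes": len(metrics),
        "passed_routes": sum(1 for m in metrics if m["passed"]),
        "failed_routes": sum(1 for m in metrics if not m["passed"]),
        "total_requests": sum(m["requests"] for m in metrics),
        "total_time_ms": sum(m["time_ms"] for m in metrics),
    }
```
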

Coverage Classes

RouteCoverage

Tracks coverage for a single route.

class pytest_routes.reporting.RouteCoverage[source]

Bases: object

Coverage information for a single route.

Variables:
  • route_path – The route path.

  • method – HTTP method.

  • tested – Whether this route was tested.

  • test_count – Number of tests run against this route.

  • status_codes_seen – Status codes returned during testing.

  • parameters_tested – Parameter names that were tested.

  • body_tested – Whether request body was tested.

Parameters:
route_path: str
method: str
tested: bool = False
test_count: int = 0
status_codes_seen: set[int]
parameters_tested: set[str]
body_tested: bool = False
property coverage_score: float

Calculate coverage score (0-100).

mark_tested(status_code, parameters=None, *, has_body=False)[source]

Mark this route as tested.

Parameters:
  • status_code (int) – Status code returned.

  • parameters (set[str] | None) – Parameters that were tested.

  • has_body (bool) – Whether request body was included.

Return type:

None

to_dict()[source]

Convert to dictionary for serialization.

Return type:

dict[str, Any]

__init__(route_path, method, tested=False, test_count=0, status_codes_seen=<factory>, parameters_tested=<factory>, body_tested=False)

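The effect of mark_tested on these fields can be sketched with a self-contained stand-in, again built only from the documented field descriptions (RouteCoverageSketch is a hypothetical name, not part of the library):

```python
from __future__ import annotations

from dataclasses import dataclass, field


@dataclass
class RouteCoverageSketch:
    """Hypothetical stand-in mirroring the documented RouteCoverage fields."""

    route_path: str
    method: str
    tested: bool = False
    test_count: int = 0
    status_codes_seen: set[int] = field(default_factory=set)
    parameters_tested: set[str] = field(default_factory=set)
    body_tested: bool = False

    def mark_tested(
        self,
        status_code: int,
        parameters: set[str] | None = None,
        *,
        has_body: bool = False,
    ) -> None:
        # Record one test observation against this route.
        self.tested = True
        self.test_count += 1
        self.status_codes_seen.add(status_code)
        if parameters:
            self.parameters_tested |= parameters
        if has_body:
            self.body_tested = True
```

Because status_codes_seen and parameters_tested are sets, repeated observations accumulate without duplication.
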
CoverageMetrics

Aggregates coverage across all routes.

class pytest_routes.reporting.CoverageMetrics[source]

Bases: object

Aggregate coverage metrics across all routes.

Variables:
  • total_routes – Total number of routes in the application.

  • tested_routes – Number of routes that were tested.

  • untested_routes – List of routes that were not tested.

  • route_coverage – Coverage information per route.

Parameters:
total_routes: int = 0
route_coverage: dict[str, RouteCoverage]
property tested_routes: int

Number of routes tested.

property untested_routes: list[str]

List of routes that were not tested.

property coverage_percentage: float

Overall route coverage as a percentage.

property average_coverage_score: float

Average coverage score across all tested routes.

add_route(route)[source]

Add a route to track coverage.

Parameters:

route (RouteInfo) – The route to track.

Return type:

RouteCoverage

Returns:

RouteCoverage instance for the route.

get_route_coverage(route)[source]

Get coverage for a specific route.

Parameters:

route (RouteInfo) – The route to get coverage for.

Return type:

RouteCoverage | None

Returns:

RouteCoverage or None if not tracked.

to_dict()[source]

Convert to dictionary for serialization.

Return type:

dict[str, Any]

__init__(total_routes=0, route_coverage=<factory>)

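The coverage_percentage property is documented only as overall route coverage as a percentage; a plausible reading (an assumption, not the library's verified formula) is tested routes over total routes:

```python
def coverage_percentage(total_routes: int, tested_routes: int) -> float:
    """Assumed formula: tested routes over total routes, as a percentage.
    The zero-route guard returning 0.0 is also an assumption."""
    if total_routes == 0:
        return 0.0
    return tested_routes / total_routes * 100.0
```
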
Coverage Functions

pytest_routes.reporting.calculate_coverage(all_routes, tested_routes)[source]

Calculate coverage metrics.

Parameters:
  • all_routes (list[RouteInfo]) – All routes in the application.

  • tested_routes (list[RouteInfo]) – Routes that were actually tested.

Return type:

CoverageMetrics

Returns:

CoverageMetrics with coverage information.

HTML Report Generation

HTMLReportGenerator

Generates HTML and JSON reports from test metrics.

class pytest_routes.reporting.HTMLReportGenerator[source]

Bases: object

Generate HTML reports for pytest-routes test results.

Parameters:

config (ReportConfig | None)

__init__(config=None)[source]

Initialize the report generator.

Parameters:

config (ReportConfig | None) – Report configuration options.

generate(metrics, coverage=None)[source]

Generate HTML report.

Parameters:
  • metrics (RunMetrics) – Test metrics to report.

  • coverage (CoverageMetrics | None) – Optional coverage metrics.

Return type:

str

Returns:

HTML content as a string.

write(metrics, coverage=None)[source]

Generate and write HTML report to file.

Parameters:
  • metrics (RunMetrics) – Test metrics to report.

  • coverage (CoverageMetrics | None) – Optional coverage metrics.

Return type:

Path

Returns:

Path to the written file.

to_json(metrics, coverage=None)[source]

Export metrics as JSON.

Parameters:
  • metrics (RunMetrics) – Test metrics to export.

  • coverage (CoverageMetrics | None) – Optional coverage metrics.

Return type:

str

Returns:

JSON string.

write_json(metrics, coverage=None, output_path=None)[source]

Write metrics as JSON file.

Parameters:
  • metrics (RunMetrics) – Test metrics to export.

  • coverage (CoverageMetrics | None) – Optional coverage metrics.

  • output_path (Path | str | None) – Path to write to (defaults to report path with .json).

Return type:

Path

Returns:

Path to the written file.

ReportConfig

Configuration for report generation.

class pytest_routes.reporting.html.ReportConfig[source]

Bases: object

Configuration for HTML report generation.

Variables:
  • output_path – Path to write the HTML report.

  • title – Title for the report.

  • include_charts – Whether to include charts.

  • include_details – Whether to include detailed route info.

  • theme – Color theme (‘light’ or ‘dark’).

Parameters:
output_path: Path | str = 'pytest-routes-report.html'
title: str = 'pytest-routes Test Report'
include_charts: bool = True
include_details: bool = True
theme: str = 'light'
__init__(output_path='pytest-routes-report.html', title='pytest-routes Test Report', include_charts=True, include_details=True, theme='light')

Usage Examples

Collecting Metrics

from pytest_routes.reporting import RunMetrics, RouteMetrics
from pytest_routes.discovery.base import RouteInfo

# Create test metrics container
test_metrics = RunMetrics()

# Get or create metrics for a route
route = RouteInfo(
    path="/users",
    methods=["GET"],
    path_params={},
    query_params={},
    body_type=None,
)
route_metrics = test_metrics.get_or_create_route_metrics(route)

# Record individual requests
route_metrics.record_request(
    status_code=200,
    elapsed_ms=12.5,
    success=True,
)
route_metrics.record_request(
    status_code=500,
    elapsed_ms=45.0,
    success=False,
    error="Internal server error",
)

# Finish collecting metrics
test_metrics.finish()

# Access aggregate statistics
print(f"Total routes: {test_metrics.total_routes}")
print(f"Pass rate: {test_metrics.pass_rate}%")
print(f"Duration: {test_metrics.duration_seconds}s")

Calculating Coverage

from pytest_routes.reporting import calculate_coverage, CoverageMetrics
from pytest_routes.discovery.base import RouteInfo

# All routes in the application
all_routes = [
    RouteInfo(path="/users", methods=["GET"], ...),
    RouteInfo(path="/users", methods=["POST"], ...),
    RouteInfo(path="/orders", methods=["GET"], ...),
]

# Routes that were actually tested
tested_routes = [
    RouteInfo(path="/users", methods=["GET"], ...),
    RouteInfo(path="/users", methods=["POST"], ...),
]

# Calculate coverage
coverage = calculate_coverage(all_routes, tested_routes)

print(f"Coverage: {coverage.coverage_percentage}%")
print(f"Untested routes: {len(coverage.untested_routes)}")

Generating Reports

from pytest_routes.reporting import HTMLReportGenerator, RunMetrics
from pytest_routes.reporting.html import ReportConfig

# Configure the report
config = ReportConfig(
    output_path="report.html",
    title="API Smoke Test Report",
    include_charts=True,
    include_details=True,
    theme="dark",
)

# Create generator
generator = HTMLReportGenerator(config)

# Generate HTML
html = generator.generate(test_metrics, coverage_metrics)

# Write HTML report (written to config.output_path)
report_path = generator.write(test_metrics, coverage_metrics)

# Write JSON report
json_path = generator.write_json(test_metrics, coverage_metrics, output_path="results.json")

See Also