# Reports and Metrics
pytest-routes can generate comprehensive HTML and JSON reports with detailed metrics about your route smoke tests. Reports include test results, coverage statistics, and performance timing data.
## Quick Start

Generate an HTML report with the `--routes-report` flag:

```shell
pytest --routes --routes-app myapp:app --routes-report pytest-routes-report.html
```

This creates a standalone HTML file with:

- Test summary (pass/fail counts)
- Route-by-route results
- Performance metrics per route
- Coverage statistics
## Report Types

### HTML Reports

HTML reports provide a visual, interactive view of your test results.

```shell
# Generate HTML report
pytest --routes --routes-app myapp:app --routes-report report.html

# With custom title
pytest --routes --routes-app myapp:app \
  --routes-report report.html \
  --routes-report-title "API Smoke Tests - Production"
```

Features:

- Responsive design (works on desktop and mobile)
- Light and dark theme support
- Sortable and filterable route table
- Color-coded pass/fail status
- Performance metrics visualization
### JSON Reports

JSON reports are ideal for CI/CD integration and programmatic analysis.

```shell
# Generate JSON report
pytest --routes --routes-app myapp:app --routes-report-json results.json

# Generate both HTML and JSON
pytest --routes --routes-app myapp:app \
  --routes-report report.html \
  --routes-report-json results.json
```

JSON structure:

```json
{
  "title": "pytest-routes Report",
  "generated_at": "2025-01-15T10:30:00Z",
  "duration_seconds": 45.3,
  "summary": {
    "total_routes": 25,
    "passed_routes": 23,
    "failed_routes": 2,
    "skipped_routes": 0,
    "pass_rate": 92.0
  },
  "coverage": {
    "total_routes": 30,
    "tested_routes": 25,
    "coverage_percentage": 83.3,
    "untested_routes": [
      {"path": "/admin/settings", "method": "PUT"},
      {"path": "/internal/health", "method": "GET"}
    ]
  },
  "routes": [
    {
      "route_path": "/users",
      "method": "GET",
      "passed": true,
      "total_requests": 100,
      "successful_requests": 100,
      "failed_requests": 0,
      "success_rate": 100.0,
      "avg_time_ms": 12.5,
      "min_time_ms": 5.2,
      "max_time_ms": 45.8,
      "status_codes": {"200": 100}
    }
  ]
}
```
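For CI scripting, the JSON report can be consumed with nothing beyond the standard library. A minimal sketch, assuming the report matches the structure shown above (the sample here is truncated to the `summary` and `coverage` sections):

```python
import json

# Truncated sample matching the documented report structure.
report_text = """
{
  "summary": {"total_routes": 25, "passed_routes": 23, "failed_routes": 2,
              "skipped_routes": 0, "pass_rate": 92.0},
  "coverage": {"total_routes": 30, "tested_routes": 25, "coverage_percentage": 83.3,
               "untested_routes": [{"path": "/admin/settings", "method": "PUT"}]}
}
"""

report = json.loads(report_text)
summary = report["summary"]

# Cross-check the reported pass rate against the raw counts.
computed = summary["passed_routes"] / summary["total_routes"] * 100
assert abs(computed - summary["pass_rate"]) < 0.1

# List untested routes, e.g. to post as a CI comment.
untested = [f'{r["method"]} {r["path"]}' for r in report["coverage"]["untested_routes"]]
print(untested)  # ['PUT /admin/settings']
```

In a real pipeline you would `json.load()` the report file produced by `--routes-report-json` instead of an inline string.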
## Configuration

### CLI Options

| Option | Default | Description |
|---|---|---|
| `--routes-report` | (not set) | Path for HTML report |
| `--routes-report-json` | (not set) | Path for JSON report |
| `--routes-report-title` | `"pytest-routes Report"` | Custom report title |
### pyproject.toml Configuration

```toml
[tool.pytest-routes.report]
enabled = true
output_path = "pytest-routes-report.html"
json_output = "pytest-routes-report.json"
title = "API Route Tests"
include_coverage = true
include_timing = true
theme = "light"  # or "dark"
```
## Metrics Collected

### Route Metrics

For each tested route, the following metrics are collected:

| Metric | Description |
|---|---|
| `total_requests` | Total number of test requests made |
| `successful_requests` | Requests that passed validation |
| `failed_requests` | Requests that failed validation |
| `success_rate` | Percentage of successful requests |
| `avg_time_ms` | Average response time in milliseconds |
| `min_time_ms` | Minimum response time |
| `max_time_ms` | Maximum response time |
| `status_codes` | Count of each HTTP status code received |
| `errors` | List of error messages for failed requests |
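The per-route metrics above can be modeled with a small dataclass. This is an illustrative sketch of how the numbers relate to each other, not the plugin's actual `RouteMetrics` class:

```python
from collections import Counter
from dataclasses import dataclass, field


@dataclass
class RouteStats:
    """Illustrative stand-in for the per-route metrics table above."""

    times_ms: list[float] = field(default_factory=list)
    status_codes: Counter = field(default_factory=Counter)
    successful_requests: int = 0
    failed_requests: int = 0

    def record(self, status_code: int, time_ms: float, success: bool) -> None:
        self.times_ms.append(time_ms)
        self.status_codes[status_code] += 1
        if success:
            self.successful_requests += 1
        else:
            self.failed_requests += 1

    @property
    def total_requests(self) -> int:
        return self.successful_requests + self.failed_requests

    @property
    def success_rate(self) -> float:
        # Percentage of requests that passed validation.
        return 100.0 * self.successful_requests / self.total_requests if self.total_requests else 0.0

    @property
    def avg_time_ms(self) -> float:
        return sum(self.times_ms) / len(self.times_ms) if self.times_ms else 0.0


stats = RouteStats()
stats.record(200, 10.0, True)
stats.record(200, 20.0, True)
stats.record(500, 30.0, False)
print(stats.total_requests, round(stats.success_rate, 1), stats.avg_time_ms)  # 3 66.7 20.0
```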
### Test Metrics

Aggregate metrics for the entire test run:

| Metric | Description |
|---|---|
| `total_routes` | Total number of routes tested |
| `passed_routes` | Routes with no failures |
| `failed_routes` | Routes with at least one failure |
| `skipped_routes` | Routes that were skipped |
| `pass_rate` | Percentage of routes that passed |
| `started_at` | Test run start timestamp |
| `finished_at` | Test run end timestamp |
| `duration_seconds` | Total test run duration |
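The aggregate values follow directly from the per-route results. A sketch of the arithmetic, with hypothetical route outcomes and timestamps:

```python
from datetime import datetime, timezone

# Hypothetical per-route outcomes: route path -> passed?
results = {"/users": True, "/items": True, "/admin": False}

started_at = datetime(2025, 1, 15, 10, 29, 15, tzinfo=timezone.utc)
finished_at = datetime(2025, 1, 15, 10, 30, 0, tzinfo=timezone.utc)

passed_routes = sum(results.values())
failed_routes = len(results) - passed_routes
pass_rate = 100.0 * passed_routes / len(results)
duration_seconds = (finished_at - started_at).total_seconds()
print(passed_routes, failed_routes, round(pass_rate, 1), duration_seconds)  # 2 1 66.7 45.0
```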
### Coverage Metrics

Route coverage statistics:

| Metric | Description |
|---|---|
| `total_routes` | Total routes in application |
| `tested_routes` | Routes that were tested |
| `untested_routes` | Routes that were not tested |
| `coverage_percentage` | Percentage of routes tested |
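Conceptually, coverage is a set difference between discovered and tested routes. A hedged sketch of what `calculate_coverage` computes (using `(method, path)` pairs as stand-ins for the library's route objects, not its actual implementation):

```python
# (method, path) pairs stand in for discovered route objects.
all_routes = {("GET", "/users"), ("POST", "/users"), ("GET", "/internal/health")}
tested_routes = {("GET", "/users"), ("POST", "/users")}

# Routes discovered but never exercised by the smoke tests.
untested_routes = sorted(all_routes - tested_routes)
coverage_percentage = 100.0 * len(tested_routes) / len(all_routes)
print(untested_routes, round(coverage_percentage, 1))
```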
## Programmatic Usage

You can use the reporting module programmatically:

```python
from pytest_routes.reporting import (
    HTMLReportGenerator,
    RouteMetrics,
    RunMetrics,
    CoverageMetrics,
    calculate_coverage,
)
from pytest_routes.reporting.html import ReportConfig

# Configure the report
config = ReportConfig(
    title="My API Tests",
    include_coverage=True,
    include_timing=True,
    theme="dark",
)

# Create test metrics
test_metrics = RunMetrics()

# Record route metrics during testing
route = RouteInfo(path="/users", methods=["GET"], ...)
route_metrics = test_metrics.get_or_create_route_metrics(route)
route_metrics.record_request(
    status_code=200,
    time_ms=12.5,
    success=True,
)

# Finish and calculate totals
test_metrics.finish()

# Calculate coverage
all_routes = [...]     # All discovered routes
tested_routes = [...]  # Routes that were tested
coverage = calculate_coverage(all_routes, tested_routes)

# Generate the report
generator = HTMLReportGenerator(config)
html = generator.generate(test_metrics, coverage)

# Write to file
generator.write_report("report.html", test_metrics, coverage)
generator.write_json("results.json", test_metrics, coverage)
```
## CI/CD Integration

### GitHub Actions

```yaml
- name: Run smoke tests with report
  run: |
    pytest --routes \
      --routes-app myapp:app \
      --routes-report pytest-routes-report.html \
      --routes-report-json pytest-routes-results.json

- name: Upload test report
  uses: actions/upload-artifact@v4
  if: always()
  with:
    name: pytest-routes-report
    path: |
      pytest-routes-report.html
      pytest-routes-results.json

- name: Check pass rate
  run: |
    PASS_RATE=$(jq '.summary.pass_rate' pytest-routes-results.json)
    if (( $(echo "$PASS_RATE < 95" | bc -l) )); then
      echo "Pass rate $PASS_RATE% is below threshold of 95%"
      exit 1
    fi
```
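If `jq` or `bc` is unavailable on the runner, the same gate can be written in Python against the JSON structure documented earlier. A sketch (the 95% threshold mirrors the shell example above):

```python
import json

THRESHOLD = 95.0  # minimum acceptable pass rate, in percent


def check_pass_rate(report_json: str, threshold: float = THRESHOLD) -> int:
    """Return a shell-style exit code: 0 if the pass rate meets the threshold."""
    summary = json.loads(report_json)["summary"]
    if summary["pass_rate"] < threshold:
        print(f"Pass rate {summary['pass_rate']}% is below threshold of {threshold}%")
        return 1
    return 0


# Against the sample report above (pass_rate 92.0), the gate fails:
print(check_pass_rate('{"summary": {"pass_rate": 92.0}}'))  # 1
```

In CI you would read the report file and pass the return value to `sys.exit()`.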
### GitLab CI

Note that GitLab's `reports: junit:` key expects JUnit XML, not this plugin's JSON report, so the example below produces a JUnit file with pytest's built-in `--junitxml` option and uploads the JSON report as a plain artifact:

```yaml
smoke_tests:
  script:
    - >
      pytest --routes --routes-app myapp:app
      --routes-report pytest-routes-report.html
      --routes-report-json pytest-routes-results.json
      --junitxml=pytest-routes-junit.xml
  artifacts:
    when: always
    paths:
      - pytest-routes-report.html
      - pytest-routes-results.json
    reports:
      junit: pytest-routes-junit.xml
```
## Customization

### Custom Report Title

```shell
pytest --routes --routes-app myapp:app \
  --routes-report report.html \
  --routes-report-title "Production API - Smoke Tests v2.0"
```
### Theme Selection

Configure the theme in `pyproject.toml`:

```toml
[tool.pytest-routes.report]
theme = "dark"  # Options: "light", "dark"
```
### Selective Metrics

Control which metrics are included:

```toml
[tool.pytest-routes.report]
include_coverage = true  # Include coverage statistics
include_timing = true    # Include performance timing
```
## See Also

- CLI Options Reference - All command-line options
- Configuration - `pyproject.toml` configuration
- Schemathesis Integration - OpenAPI contract testing