# Why Test API Documentation?
API documentation is a contract with your consumers. When the documentation says GET /users returns a JSON object with name and email, consumers build their integrations based on that promise. If the API actually returns username instead of name, every integration breaks.
Documentation testing catches these discrepancies before consumers find them in production:
- Endpoints that exist in docs but were removed from the API.
- Parameters that are required in the API but documented as optional.
- Response fields that differ between docs and actual responses.
- Status codes that the API returns but docs do not mention.
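At its simplest, a drift check compares the field names the docs promise against the fields the API actually returned. A minimal sketch of that idea, using the hypothetical `name`/`username` mismatch from above:

```javascript
// Compare documented field names against an actual response body.
// Returns fields the docs promise but the API omitted, and fields the
// API returned that the docs never mention.
function findFieldDrift(documentedFields, actualBody) {
  const actualFields = Object.keys(actualBody);
  return {
    missingFromApi: documentedFields.filter((f) => !actualFields.includes(f)),
    undocumented: actualFields.filter((f) => !documentedFields.includes(f)),
  };
}

// Docs promise name and email; the API actually returns username and email.
const drift = findFieldDrift(
  ['name', 'email'],
  { username: 'ada', email: 'ada@example.com' }
);
console.log(drift);
// { missingFromApi: [ 'name' ], undocumented: [ 'username' ] }
```

Real drift-detection tools generalize this to full JSON Schema validation, but the failure mode they catch is exactly this one.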
## Types of Documentation Tests

### 1. Specification Validity
Verify the OpenAPI/Swagger spec itself is valid and well-formed:
```shell
# Validate OpenAPI spec syntax
npx @redocly/cli lint openapi.yaml

# Or using swagger-cli
npx swagger-cli validate openapi.yaml
```
Common issues:
- Missing required fields in the spec.
- Invalid JSON Schema references.
- Duplicate operation IDs.
- Missing response definitions for error codes.
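Linters catch these issues mechanically. As an illustration of what such a check looks like, duplicate operation IDs can be found with a short walk over an already-parsed spec (the spec object below is a hypothetical example):

```javascript
// Walk every operation in a parsed OpenAPI document and collect
// operationIds that appear more than once.
function findDuplicateOperationIds(spec) {
  const seen = new Set();
  const duplicates = [];
  for (const methods of Object.values(spec.paths ?? {})) {
    for (const op of Object.values(methods)) {
      // Skip path-level keys (e.g. parameters) and operations without an ID.
      if (!op || !op.operationId) continue;
      if (seen.has(op.operationId)) duplicates.push(op.operationId);
      seen.add(op.operationId);
    }
  }
  return duplicates;
}

const spec = {
  paths: {
    '/users': { get: { operationId: 'listUsers' } },
    '/accounts': { get: { operationId: 'listUsers' } }, // duplicate
  },
};
console.log(findDuplicateOperationIds(spec)); // [ 'listUsers' ]
```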
### 2. Documentation Completeness
Verify every API endpoint is documented:
```javascript
// Compare routes registered in your application against routes in the spec.
// Express.js example: app._router is a private API and may change between
// versions. Routes with multiple methods produce one entry each.
const actualRoutes = app._router.stack
  .filter((layer) => layer.route)
  .flatMap((layer) => Object.keys(layer.route.methods)
    .map((method) => `${method.toUpperCase()} ${layer.route.path}`));
```
Checklist:
- Every endpoint has a description.
- Every parameter has a description and type.
- Every response code has a description and schema.
- Error responses (400, 401, 403, 404, 422, 500) are documented.
- Authentication requirements are specified.
- Request/response examples are provided.
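Checks like these can be scripted against the parsed spec. A minimal sketch covering two of the checklist items, descriptions and documented error responses (the spec object is a hypothetical example):

```javascript
const HTTP_METHODS = ['get', 'post', 'put', 'patch', 'delete'];

// Report operations missing a description or any documented 4xx/5xx response.
function auditCompleteness(spec) {
  const gaps = [];
  for (const [path, methods] of Object.entries(spec.paths ?? {})) {
    for (const method of HTTP_METHODS) {
      const op = methods[method];
      if (!op) continue;
      const label = `${method.toUpperCase()} ${path}`;
      if (!op.description) gaps.push(`${label}: missing description`);
      const codes = Object.keys(op.responses ?? {});
      if (!codes.some((c) => c >= '400')) {
        gaps.push(`${label}: no error responses documented`);
      }
    }
  }
  return gaps;
}

const spec = {
  paths: {
    '/users': {
      get: { description: 'List users', responses: { 200: {}, 500: {} } },
      post: { responses: { 201: {} } }, // no description, no error responses
    },
  },
};
console.log(auditCompleteness(spec)); // reports both gaps on POST /users
```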
### 3. Documentation Accuracy (Drift Detection)
The most valuable test: verify that the docs match the actual API behavior.
Using Dredd:
```shell
# Install Dredd
npm install -g dredd

# Run Dredd against your API
dredd openapi.yaml http://localhost:3000
```
Dredd reads the OpenAPI spec, makes real API calls, and reports discrepancies.
Using Schemathesis:
```shell
# Install Schemathesis
pip install schemathesis

# Run property-based testing against the spec
schemathesis run http://localhost:3000/openapi.json

# Or with specific checks
schemathesis run http://localhost:3000/openapi.json \
  --checks all \
  --validate-schema true
```
Schemathesis generates random valid requests based on the spec and verifies responses match the documented schemas. It can find edge cases that manual testing misses.
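Under the hood, the core check these tools perform is schema conformance: does each response value have the type the spec documents? A stripped-down sketch of that idea (real tools use full JSON Schema validation; the schema and body here are hypothetical):

```javascript
// Check an actual response body against documented property types.
// Covers only the primitive JSON Schema types where typeof agrees:
// string, number, boolean.
function checkResponseTypes(schemaProperties, body) {
  const errors = [];
  for (const [field, def] of Object.entries(schemaProperties)) {
    if (!(field in body)) {
      errors.push(`${field}: documented but missing from response`);
    } else if (typeof body[field] !== def.type) {
      errors.push(`${field}: documented as ${def.type}, got ${typeof body[field]}`);
    }
  }
  return errors;
}

const documented = { id: { type: 'number' }, name: { type: 'string' } };
const actual = { id: '42', name: 'Ada' }; // id came back as a string
console.log(checkResponseTypes(documented, actual));
// [ 'id: documented as number, got string' ]
```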
### 4. Example Validation
Verify that documented examples actually work:
```javascript
// Parse the OpenAPI spec (assumes js-yaml; run inside an async test)
const spec = yaml.load(fs.readFileSync('openapi.yaml', 'utf8'));

// For each endpoint with a documented request example
for (const [path, methods] of Object.entries(spec.paths)) {
  for (const [method, details] of Object.entries(methods)) {
    const example = details.requestBody?.content?.['application/json']?.example;
    if (!example) continue;

    // Send the documented example to the API
    const response = await fetch(`http://localhost:3000${path}`, {
      method: method.toUpperCase(),
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(example),
    });

    // Verify it returns the documented success status code
    // (the first key in responses is not guaranteed to be the 2xx one)
    const expectedStatus = Object.keys(details.responses)
      .find((code) => code.startsWith('2'));
    expect(response.status).toBe(parseInt(expectedStatus, 10));
  }
}
```
## Common Documentation Bugs
| Bug Type | Example | Impact |
|---|---|---|
| Wrong status code | Docs say 200, API returns 201 for POST | Consumers check wrong code |
| Missing field | Docs show name, API returns username | Consumers get undefined |
| Wrong type | Docs say string, API returns number | Type errors in consumers |
| Missing error | 422 not documented | Consumers do not handle validation errors |
| Wrong auth | Docs say API key, API requires Bearer token | Auth failures |
| Stale endpoint | Docs list /v1/users, API only has /v2/users | 404 errors |
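Several rows of this table reduce to one automated check: did the API respond with a status code the spec documents for that operation? A minimal sketch:

```javascript
// Flag observed status codes that the spec does not document for an operation.
function undocumentedStatusCodes(documentedCodes, observedCodes) {
  // Normalize to strings so numeric and string code lists compare equal.
  const documented = new Set(documentedCodes.map(String));
  return observedCodes.filter((code) => !documented.has(String(code)));
}

// Docs document 200 and 404, but the API was also seen returning 201 and 422.
console.log(undocumentedStatusCodes(['200', '404'], [200, 201, 404, 422]));
// [ 201, 422 ]
```

Run against traffic captured in integration tests, a check like this surfaces both the wrong-status-code and missing-error rows above.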
## Automating Drift Detection in CI
Add documentation testing to your CI pipeline:
```yaml
# GitHub Actions example
name: API Doc Testing

on: [push, pull_request]

jobs:
  doc-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Start API server
        run: docker compose up -d
      - name: Validate spec syntax
        run: npx @redocly/cli lint openapi.yaml
      - name: Run Dredd
        run: npx dredd openapi.yaml http://localhost:3000
      - name: Run Schemathesis
        run: |
          pip install schemathesis
          schemathesis run http://localhost:3000/openapi.json --checks all
```
## Exercise: API Documentation Audit

### Setup
Choose an API to audit — either your own project’s API or a public API with an OpenAPI spec.
### Task 1: Spec Validation

- Validate the OpenAPI spec with @redocly/cli lint.
- Document every warning and error.
- Fix the spec issues and re-validate.
### Task 2: Completeness Audit

Review every endpoint in the spec and verify:

| Endpoint | Description | Params Documented | Responses Documented | Examples | Auth Specified |
|---|---|---|---|---|---|
| GET /users | | | | | |
| POST /users | | | | | |
| … | | | | | |
Mark each as complete or incomplete. List specific gaps.
### Task 3: Drift Detection
Run Dredd or Schemathesis against the API:
- Record every discrepancy found.
- Categorize each as: missing endpoint, wrong response, wrong status code, wrong schema, or missing error.
- For each discrepancy, determine if the bug is in the docs or the API.
### Task 4: Error Documentation
Test every documented error response:
- Trigger each error condition (invalid input, missing auth, not found, etc.).
- Compare the actual error response against the documented format.
- Document any undocumented error responses you discover.
### Task 5: Write a Documentation Test Suite
Create an automated test suite that:
- Validates spec syntax.
- Sends each documented example and verifies the response.
- Triggers each documented error and verifies the error response.
- Runs in CI on every pull request.
### Deliverables
- Spec validation report with issues found and fixed.
- Completeness audit table.
- Drift detection report categorizing each discrepancy.
- Error documentation comparison table.
- Automated test suite code.