Why Test API Documentation?

API documentation is a contract with your consumers. When the documentation says GET /users returns a JSON object with name and email, consumers build their integrations based on that promise. If the API actually returns username instead of name, every integration breaks.

Documentation testing catches these discrepancies before consumers find them in production:

  • Endpoints that exist in docs but were removed from the API.
  • Parameters that are required in the API but documented as optional.
  • Response fields that differ between docs and actual responses.
  • Status codes that the API returns but docs do not mention.

Types of Documentation Tests

1. Specification Validity

Verify the OpenAPI/Swagger spec itself is valid and well-formed:

# Validate OpenAPI spec syntax
npx @redocly/cli lint openapi.yaml

# Or using swagger-cli
npx swagger-cli validate openapi.yaml

Common issues:

  • Missing required fields in the spec.
  • Invalid JSON Schema references.
  • Duplicate operation IDs.
  • Missing response definitions for error codes.
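Several of these checks are easy to script yourself. As a sketch (assuming the spec has already been parsed into a plain object, e.g. with js-yaml; the function name is illustrative), duplicate operation IDs can be detected like this:

```javascript
// Sketch: scan a parsed OpenAPI document for duplicate operationIds.
const HTTP_METHODS = ['get', 'put', 'post', 'delete', 'options', 'head', 'patch', 'trace'];

function findDuplicateOperationIds(spec) {
  const seen = new Map(); // operationId -> first "METHOD path" that used it
  const duplicates = [];
  for (const [path, item] of Object.entries(spec.paths ?? {})) {
    for (const method of HTTP_METHODS) {
      const op = item[method];
      if (!op?.operationId) continue;
      if (seen.has(op.operationId)) {
        duplicates.push({
          operationId: op.operationId,
          first: seen.get(op.operationId),
          second: `${method.toUpperCase()} ${path}`,
        });
      } else {
        seen.set(op.operationId, `${method.toUpperCase()} ${path}`);
      }
    }
  }
  return duplicates;
}
```

Linters like @redocly/cli already catch this, but the same traversal pattern is the basis for any custom spec check you need.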

2. Documentation Completeness

Verify every API endpoint is documented:

// Compare routes in your application against routes in the spec
// Express.js example (Express 4 exposes the router as app._router;
// this internal API changed in Express 5)
app._router.stack
  .filter(layer => layer.route)
  .flatMap(layer =>
    Object.keys(layer.route.methods)
      .map(method => `${method.toUpperCase()} ${layer.route.path}`)
  );
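With the route list in hand, completeness checking is a set difference against the spec's paths. A minimal sketch (function name illustrative; assumes Express-style :id parameters correspond to OpenAPI {id} placeholders):

```javascript
// Sketch: diff the routes an app actually serves against the spec's paths.
// `appRoutes` is a list like ['GET /users', 'DELETE /users/:id'], e.g.
// collected from the Express router as in the snippet above.
const HTTP_METHODS = ['get', 'put', 'post', 'delete', 'options', 'head', 'patch', 'trace'];

function findUndocumentedRoutes(appRoutes, spec) {
  const documented = new Set();
  for (const [path, item] of Object.entries(spec.paths ?? {})) {
    for (const method of HTTP_METHODS) {
      if (item[method]) documented.add(`${method.toUpperCase()} ${path}`);
    }
  }
  return appRoutes.filter(route => {
    // Convert Express-style :id params to OpenAPI {id} style before comparing
    const normalized = route.replace(/:([A-Za-z_][A-Za-z0-9_]*)/g, '{$1}');
    return !documented.has(normalized);
  });
}
```

Running the same diff in the other direction finds endpoints that exist in docs but were removed from the API.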

Checklist:

  • Every endpoint has a description.
  • Every parameter has a description and type.
  • Every response code has a description and schema.
  • Error responses (400, 401, 403, 404, 422, 500) are documented.
  • Authentication requirements are specified.
  • Request/response examples are provided.
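Most of this checklist can be enforced mechanically. A sketch of an automated completeness audit over a parsed spec (the function name and message formats are illustrative):

```javascript
// Sketch: walk a parsed OpenAPI spec and report checklist gaps.
const HTTP_METHODS = ['get', 'put', 'post', 'delete', 'options', 'head', 'patch', 'trace'];

function auditCompleteness(spec) {
  const gaps = [];
  for (const [path, item] of Object.entries(spec.paths ?? {})) {
    for (const method of HTTP_METHODS) {
      const op = item[method];
      if (!op) continue;
      const where = `${method.toUpperCase()} ${path}`;
      if (!op.description && !op.summary) {
        gaps.push(`${where}: missing description`);
      }
      for (const param of op.parameters ?? []) {
        if (!param.description) {
          gaps.push(`${where}: parameter '${param.name}' missing description`);
        }
      }
      for (const [code, resp] of Object.entries(op.responses ?? {})) {
        if (!resp.description) {
          gaps.push(`${where}: response ${code} missing description`);
        }
      }
    }
  }
  return gaps;
}
```

Failing CI when this returns a non-empty list keeps the docs from regressing as endpoints are added.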

3. Documentation Accuracy (Drift Detection)

The most valuable test: verify that the docs match the actual API behavior.

Using Dredd:

# Install Dredd
npm install -g dredd

# Run Dredd against your API
dredd openapi.yaml http://localhost:3000

Dredd reads the OpenAPI spec, makes real API calls, and reports discrepancies.
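Real APIs usually require authentication, which Dredd handles through hookfiles. A sketch of a hookfile run with dredd openapi.yaml http://localhost:3000 --hookfiles=./hooks.js (the header and environment variable are assumptions; adapt them to your API's auth scheme):

```javascript
// hooks.js -- runs inside Dredd, which provides the 'hooks' module
const hooks = require('hooks');

// Inject an auth header before every transaction
hooks.beforeEach((transaction, done) => {
  transaction.request.headers['Authorization'] =
    'Bearer ' + (process.env.API_TOKEN ?? '');
  done();
});
```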

Using Schemathesis:

# Install Schemathesis
pip install schemathesis

# Run property-based testing against the spec
schemathesis run http://localhost:3000/openapi.json

# Or with specific checks
schemathesis run http://localhost:3000/openapi.json \
  --checks all \
  --validate-schema true

Schemathesis generates random valid requests based on the spec and verifies responses match the documented schemas. It can find edge cases that manual testing misses.

4. Example Validation

Verify that documented examples actually work:

// Parse the OpenAPI spec (requires the js-yaml package)
const fs = require('fs');
const yaml = require('js-yaml');

const spec = yaml.load(fs.readFileSync('openapi.yaml', 'utf8'));

const HTTP_METHODS = ['get', 'put', 'post', 'delete', 'patch'];

// For each endpoint with a request-body example
for (const [path, methods] of Object.entries(spec.paths)) {
  for (const [method, details] of Object.entries(methods)) {
    // Skip non-operation keys such as `parameters`
    if (!HTTP_METHODS.includes(method)) continue;

    if (details.requestBody?.content?.['application/json']?.example) {
      const example = details.requestBody.content['application/json'].example;

      // Send the documented example to the API
      const response = await fetch(`http://localhost:3000${path}`, {
        method: method.toUpperCase(),
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(example),
      });

      // Verify it returns the first documented status code
      // (assumes the success code is listed first in `responses`)
      const expectedStatus = Object.keys(details.responses)[0];
      expect(response.status).toBe(parseInt(expectedStatus, 10));
    }
  }
}

Common Documentation Bugs

Bug Type          | Example                                     | Impact
------------------|---------------------------------------------|-------------------------------------------
Wrong status code | Docs say 200, API returns 201 for POST      | Consumers check wrong code
Missing field     | Docs show name, API returns username        | Consumers get undefined
Wrong type        | Docs say string, API returns number         | Type errors in consumers
Missing error     | 422 not documented                          | Consumers do not handle validation errors
Wrong auth        | Docs say API key, API requires Bearer token | Auth failures
Stale endpoint    | Docs list /v1/users, API only has /v2/users | 404 errors
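Several of these bugs (missing field, wrong type, undocumented field) can be caught by comparing an actual response body against the documented schema. In practice a JSON Schema validator such as Ajv does this; a dependency-free sketch of the idea (function name and message formats are illustrative):

```javascript
// Sketch: compare an actual JSON response against a documented object
// schema's properties and types. Maps JSON Schema types to typeof results.
const JS_TYPE = { string: 'string', number: 'number', integer: 'number', boolean: 'boolean' };

function diffAgainstSchema(schema, body) {
  const problems = [];
  const props = schema.properties ?? {};
  for (const [field, def] of Object.entries(props)) {
    if (!(field in body)) {
      problems.push(`missing field: ${field}`);
    } else if (def.type && typeof body[field] !== (JS_TYPE[def.type] ?? 'object')) {
      problems.push(
        `wrong type: ${field} documented as ${def.type}, got ${typeof body[field]}`
      );
    }
  }
  // Fields the API returns but the docs never mention
  for (const field of Object.keys(body)) {
    if (!(field in props)) problems.push(`undocumented field: ${field}`);
  }
  return problems;
}
```

For example, if the docs promise a string name and a numeric age but the API returns username and a string age, all three problem classes show up in one call.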

Automating Drift Detection in CI

Add documentation testing to your CI pipeline:

# GitHub Actions example
name: API Doc Testing
on: [push, pull_request]

jobs:
  doc-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Start API server
        run: docker compose up -d
      - name: Wait for API to be ready
        run: npx wait-on http://localhost:3000 --timeout 60000
      - name: Validate spec syntax
        run: npx @redocly/cli lint openapi.yaml
      - name: Run Dredd
        run: npx dredd openapi.yaml http://localhost:3000
      - name: Run Schemathesis
        run: |
          pip install schemathesis
          schemathesis run http://localhost:3000/openapi.json --checks all

Exercise: API Documentation Audit

Setup

Choose an API to audit — either your own project’s API or a public API with an OpenAPI spec.

Task 1: Spec Validation

  1. Validate the OpenAPI spec with @redocly/cli lint.
  2. Document every warning and error.
  3. Fix the spec issues and re-validate.

Task 2: Completeness Audit

Review every endpoint in the spec and verify:

Endpoint    | Description | Params Documented | Responses Documented | Examples | Auth Specified
------------|-------------|-------------------|----------------------|----------|---------------
GET /users  |             |                   |                      |          |
POST /users |             |                   |                      |          |

Mark each as complete or incomplete. List specific gaps.

Task 3: Drift Detection

Run Dredd or Schemathesis against the API:

  1. Record every discrepancy found.
  2. Categorize each as: missing endpoint, wrong response, wrong status code, wrong schema, or missing error.
  3. For each discrepancy, determine if the bug is in the docs or the API.

Task 4: Error Documentation

Test every documented error response:

  1. Trigger each error condition (invalid input, missing auth, not found, etc.).
  2. Compare the actual error response against the documented format.
  3. Document any undocumented error responses you discover.

Task 5: Write a Documentation Test Suite

Create an automated test suite that:

  1. Validates spec syntax.
  2. Sends each documented example and verifies the response.
  3. Triggers each documented error and verifies the error response.
  4. Runs in CI on every pull request.

Deliverables

  1. Spec validation report with issues found and fixed.
  2. Completeness audit table.
  3. Drift detection report categorizing each discrepancy.
  4. Error documentation comparison table.
  5. Automated test suite code.