Why Accessibility Testing Matters

Accessibility testing ensures that people with disabilities can use your product. This includes users who are blind or have low vision, deaf or hard of hearing, have motor disabilities, cognitive disabilities, or temporary impairments (a broken arm, bright sunlight on a screen).

Roughly 15% of the world’s population — over 1 billion people — experience some form of disability. Beyond the ethical imperative, accessibility is increasingly a legal requirement. The Americans with Disabilities Act (ADA), European Accessibility Act (EAA), and similar laws worldwide mandate accessible digital products. Lawsuits over web accessibility have grown significantly, with over 4,000 ADA-related digital lawsuits filed in the US annually.

For a QA engineer, accessibility testing is not optional — it is a core quality attribute that you must verify.

WCAG 2.1: The Standard

The Web Content Accessibility Guidelines (WCAG) 2.1, published by the W3C, is the global standard for web accessibility. It is organized around four principles known as POUR.

POUR Principles

Perceivable — Information and UI components must be presentable to users in ways they can perceive.

  • Text alternatives for images (alt text)
  • Captions for video
  • Sufficient color contrast
  • Content adaptable to different presentations

Operable — UI components and navigation must be operable.

  • Keyboard accessible (all functionality via keyboard)
  • Enough time to read and use content
  • No content that causes seizures (flashing)
  • Navigable (clear headings, focus order, skip links)

Understandable — Information and UI operation must be understandable.

  • Readable text (language specified, clear writing)
  • Predictable behavior (consistent navigation, no unexpected changes)
  • Input assistance (error identification, labels, instructions)

Robust — Content must be robust enough to be interpreted by assistive technologies.

  • Valid HTML
  • Proper use of ARIA attributes
  • Compatible with current and future user agents

Conformance Levels

| Level | Description | Requirement |
| --- | --- | --- |
| A | Minimum accessibility | Must-have basics (alt text, keyboard access, form labels) |
| AA | Standard accessibility | Most legal requirements target this level (contrast, resize, captions) |
| AAA | Enhanced accessibility | Highest standard (sign language, enhanced contrast, reading level) |

Most organizations target Level AA compliance. Level AAA is aspirational and not typically required by law.

Common Accessibility Issues

Visual

  • Missing alt text: Images without descriptive alt attributes
  • Poor color contrast: Text-to-background contrast below 4.5:1
  • Color-only information: Using color alone to convey meaning (red = error, green = success)
  • Missing focus indicators: No visible outline when tabbing through interactive elements
  • Text in images: Critical text embedded in images rather than HTML
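
The 4.5:1 threshold above comes from WCAG's relative-luminance formula. A minimal sketch of the calculation (function names are my own, not from any library):

```javascript
// Contrast ratio between two sRGB colors, per WCAG 2.1 (SC 1.4.3).
// Each color is [r, g, b] with channels in 0-255.

// Linearize one sRGB channel (0-255 -> 0-1).
function linearize(channel) {
  const c = channel / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Relative luminance: weighted sum of the linearized channels.
function luminance([r, g, b]) {
  return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b);
}

// Contrast ratio (L1 + 0.05) / (L2 + 0.05), lighter color in the numerator.
function contrastRatio(fg, bg) {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

// Black on white: the maximum possible ratio, 21:1.
console.log(contrastRatio([0, 0, 0], [255, 255, 255]).toFixed(1)); // "21.0"
// Mid-gray #777777 on white is about 4.48:1 -- it fails the 4.5:1 AA
// threshold for body text, even though it looks readable to many eyes.
console.log(contrastRatio([119, 119, 119], [255, 255, 255]) >= 4.5);
```

This is why contrast must be measured, not eyeballed: colors that look fine can sit just below the threshold.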

Keyboard Navigation

  • Keyboard traps: Focus gets stuck in a component with no way to tab out
  • Illogical tab order: Focus jumps around the page unexpectedly
  • Missing skip links: No way to skip past repetitive navigation to main content
  • Non-keyboard-accessible controls: Dropdowns, modals, or custom widgets that only work with a mouse

Screen Reader

  • Missing ARIA labels: Interactive elements without accessible names
  • Incorrect heading hierarchy: Jumping from <h1> to <h4> breaks navigation
  • Missing form labels: Input fields without associated <label> elements
  • Dynamic content not announced: AJAX updates that screen readers do not detect
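
A broken hierarchy like the <h1>-to-<h4> jump above is easy to detect programmatically. A small sketch (the function name is my own) that flags any heading that skips a level on the way down:

```javascript
// Given the page's heading tags in document order (e.g. ["h1", "h2", "h4"]),
// report every place the hierarchy skips a level going deeper.
function findHeadingSkips(headings) {
  const skips = [];
  let prev = 0; // level of the previous heading; 0 before the first one
  for (const tag of headings) {
    const level = parseInt(tag.slice(1), 10); // "h3" -> 3
    // Going deeper by more than one level breaks screen-reader navigation;
    // moving back up (h3 -> h2) is always allowed.
    if (level > prev + 1) {
      skips.push(prev === 0 ? `first heading is h${level}` : `h${prev} -> h${level}`);
    }
    prev = level;
  }
  return skips;
}

console.log(findHeadingSkips(["h1", "h2", "h4"])); // [ 'h2 -> h4' ]
console.log(findHeadingSkips(["h1", "h2", "h3", "h2"])); // []
```

In a real audit you would feed this the output of `document.querySelectorAll("h1, h2, h3, h4, h5, h6")`; axe DevTools performs an equivalent check automatically.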

Multimedia

  • Missing captions: Videos without synchronized text captions
  • No audio descriptions: Visual information in video not described audibly
  • Auto-playing media: Audio or video that starts automatically without user control

Automated Testing Tools

Automated tools catch approximately 30-40% of accessibility issues; the rest must be found through manual testing.

| Tool | Type | Best For |
| --- | --- | --- |
| axe DevTools | Browser extension | Comprehensive automated scanning |
| Lighthouse | Chrome built-in | Quick accessibility score and audit |
| WAVE | Browser extension | Visual overlay of accessibility issues |
| Pa11y | CLI tool | CI/CD integration |
| axe-core | Library | Integration into automated test suites |
| Colour Contrast Analyser | Desktop app | Manual contrast checking |
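
For the CI/CD use case, Pa11y's CI runner reads URLs and options from a `.pa11yci` file in the repository root. A minimal sketch (the URLs are placeholders; option names reflect my understanding of recent Pa11y CI versions):

```json
{
  "defaults": {
    "standard": "WCAG2AA",
    "timeout": 30000
  },
  "urls": [
    "https://example.com/",
    "https://example.com/checkout"
  ]
}
```

Running `pa11y-ci` in the pipeline then exits non-zero when any listed page has errors, failing the build before an accessibility regression ships.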

Using axe DevTools

  1. Install the axe DevTools Chrome extension
  2. Open Chrome DevTools (F12)
  3. Navigate to the “axe DevTools” tab
  4. Click “Scan ALL of my page”
  5. Review issues by severity (Critical, Serious, Moderate, Minor)

Using Lighthouse

  1. Open Chrome DevTools (F12)
  2. Go to the “Lighthouse” tab
  3. Check “Accessibility” category
  4. Click “Analyze page load”
  5. Review the score (0-100) and specific issues

Manual Testing Techniques

Keyboard Testing

The most important manual test. Navigate the entire page using only a keyboard:

  • Tab: Move forward through interactive elements
  • Shift+Tab: Move backward
  • Enter/Space: Activate buttons and links
  • Arrow keys: Navigate within components (dropdowns, radio buttons)
  • Escape: Close modals and popups

Check: Can you reach every interactive element? Is the focus visible? Can you escape every component? Is the tab order logical?

Screen Reader Testing

Test with at least one screen reader:

  • NVDA (Windows, free)
  • VoiceOver (macOS/iOS, built-in)
  • JAWS (Windows, commercial)
  • TalkBack (Android, built-in)

Listen for: Are images described? Are form fields labeled? Are headings announced? Do buttons have meaningful names?

Zoom Testing

Zoom the page to 200% and check:

  • Is all content still visible?
  • Does the layout adapt without horizontal scrolling?
  • Is text readable without overlapping?

Exercise: WCAG 2.1 AA Audit

Perform an accessibility audit of a public website targeting WCAG 2.1 Level AA compliance.

Task

Audit the website using both automated tools and manual techniques. Document at least 10 issues.

Steps

  1. Run axe DevTools scan and document all Critical and Serious issues
  2. Run Lighthouse accessibility audit and note the score
  3. Perform keyboard navigation test on the homepage and one form page
  4. Test with VoiceOver or NVDA on at least 3 pages
  5. Check color contrast on all text elements
  6. Verify all images have appropriate alt text
  7. Check heading hierarchy (h1 > h2 > h3 order)
  8. Test at 200% zoom
Hint: Audit Report Template

| # | Issue | WCAG Criterion | Level | Tool/Method | Severity | Location | Recommendation |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | Missing alt text on hero image | 1.1.1 Non-text Content | A | axe | Critical | Homepage hero | Add descriptive alt text |
| 2 | | | | | | | |

Group issues by POUR principle for clarity.

Solution: Example Audit Report

Automated Scan Results (axe DevTools):

| # | Issue | Criterion | Level | Severity | Count |
| --- | --- | --- | --- | --- | --- |
| 1 | Images missing alt text | 1.1.1 | A | Critical | 5 |
| 2 | Form inputs without labels | 1.3.1 | A | Critical | 3 |
| 3 | Color contrast insufficient | 1.4.3 | AA | Serious | 8 |
| 4 | Links with no discernible text | 4.1.2 | A | Serious | 2 |
| 5 | Missing document language | 3.1.1 | A | Serious | 1 |

Manual Testing Results:

| # | Issue | Criterion | Level | Method | Severity |
| --- | --- | --- | --- | --- | --- |
| 6 | Keyboard trap in date picker modal | 2.1.2 | A | Keyboard | Critical |
| 7 | No skip navigation link | 2.4.1 | A | Keyboard | Serious |
| 8 | Focus not visible on nav links | 2.4.7 | AA | Keyboard | Serious |
| 9 | Heading hierarchy skips h2 to h4 | 1.3.1 | A | Screen reader | Moderate |
| 10 | Video without captions | 1.2.2 | A | Manual review | Critical |
| 11 | Error messages not associated with fields | 3.3.1 | A | Screen reader | Serious |
| 12 | Horizontal scroll required at 200% zoom | 1.4.10 | AA | Zoom test | Serious |

Summary: Lighthouse score: 62/100. Found 12 issues: 4 Critical, 7 Serious, 1 Moderate. The site fails WCAG 2.1 Level A on multiple criteria and Level AA on contrast and zoom.

Priority fixes: (1) Add alt text to all images, (2) Fix keyboard trap in date picker, (3) Add video captions, (4) Fix color contrast issues, (5) Add form labels.

Pro Tips

  • Automated Tests Catch Only 30-40%: Never rely solely on automated tools. Keyboard testing and screen reader testing are essential for catching the most impactful issues that affect real users.
  • Integrate axe-core into CI/CD: Add @axe-core/playwright or @axe-core/webdriverio to your automated test suite so accessibility regressions are caught before deployment.
  • Test with Real Users: If possible, include users with disabilities in your usability testing. They will find issues that no tool or sighted tester can identify.
  • ARIA is a Last Resort: ARIA attributes should supplement semantic HTML, not replace it. Using a <button> element is always better than a <div role="button">. Most accessibility issues come from misusing or overusing ARIA.
  • Accessibility is Not a One-Time Audit: Build accessibility checks into your regular testing cycle. Every new feature should be tested for keyboard access, screen reader compatibility, and color contrast before release.