When people ask us whether they should use automated or manual accessibility testing, the honest answer is both. The real question is how to layer them so you get the most coverage for the least effort. Each approach has clear strengths and real limits.

What Automated Testing Can Do

Automated accessibility testing uses software to scan your pages and flag anything that breaks a WCAG rule. The big names are axe-core (what Ardor Accessibility runs on), WAVE, Lighthouse, and Pa11y.

Automated tools are great at catching:

  • Missing alt text on images.
  • Color contrast failures between text and background.
  • Missing form labels and ARIA attributes.
  • Heading hierarchy issues like skipped levels.
  • Missing language declaration on the HTML element.
  • Keyboard traps on interactive elements.
  • Duplicate ID attributes in the DOM.
  • Missing link and button names.

These are yes-or-no questions. Either the alt text exists or it does not. Either contrast meets the 4.5:1 minimum WCAG requires for normal-size text or it does not. A scanner can run hundreds of rules across thousands of pages in minutes, which no human team can match.
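The contrast check is a good example of why these questions suit machines: it is pure arithmetic. WCAG defines a relative luminance for each color and a contrast ratio between the two luminances, which a few lines of Python can compute (a minimal sketch of the WCAG 2.x formula, not any particular tool's implementation):

```python
def relative_luminance(rgb):
    """WCAG relative luminance for an sRGB color given as (r, g, b) in 0-255."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colors per WCAG 2.x, from 1.0 to 21.0."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black on white is the maximum possible contrast, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 2))  # → 21.0
# Mid-gray (#777777) on white falls just short of the 4.5:1 body-text minimum.
print(contrast_ratio((119, 119, 119), (255, 255, 255)) >= 4.5)  # → False
```

A scanner runs exactly this kind of check against every text node on the page, which is why it never misses a contrast failure and never needs a judgment call.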

What Automated Testing Cannot Do

Years of industry research put automated coverage at roughly 30 to 40 percent of WCAG violations. The rest needs a human. Scanners struggle with:

  • Alt text quality. A tool can tell you alt text exists. It cannot tell you the alt text actually describes the image.
  • Logical reading order. Whether content flows in a way that makes sense to a screen reader.
  • Keyboard navigation usability. Whether the tab order is sensible, not just whether elements are focusable.
  • Complex widgets. Whether custom JavaScript components like carousels, modals, and date pickers actually work with assistive tech.
  • Video caption accuracy. Whether captions are synced, complete, and correct.
  • Cognitive accessibility. Whether content is in plain language, instructions are clear, and error recovery is intuitive.

The Role of Automated Remediation

Detection is only half the job. Automated remediation tools like the Ardor Accessibility widget go further and actually fix things in real time. On every page load, the widget:

  • Scans the DOM for violations.
  • Applies programmatic fixes (injecting ARIA labels, adjusting contrast, adding keyboard handlers).
  • Logs every remediation attempt with a timestamp.
  • Reports back on what it fixed and what it could not.

That log becomes a documented record of good-faith effort, which matters if anyone ever asks. It shows you are actively remediating barriers, including the ones you inherited from a third-party platform or old content.
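In spirit, the loop above boils down to three moves: find a violation, patch what you can, and write a timestamped record of the attempt. Here is a minimal, hypothetical sketch of that pattern. The dict-based "elements," the two fix rules, and the log shape are all illustrative stand-ins, not the Ardor widget's actual code:

```python
from datetime import datetime, timezone

def remediate(elements):
    """Apply simple programmatic fixes and log every attempt.

    `elements` is a hypothetical stand-in for DOM nodes: dicts with a tag
    name and attributes. Returns the patched elements plus an audit log.
    """
    log = []
    for el in elements:
        entry = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "tag": el["tag"],
            "issue": None,
            "fixed": False,
        }
        # Illustrative rule: an <img> with no alt attribute gets an empty alt
        # so screen readers skip it instead of announcing the file name.
        if el["tag"] == "img" and "alt" not in el:
            el["alt"] = ""
            entry.update(issue="missing-alt", fixed=True)
        # Illustrative rule: a <button> with no accessible name gets a generic
        # aria-label; logged as fixed, but flagged for human review.
        elif el["tag"] == "button" and not el.get("text") and "aria-label" not in el:
            el["aria-label"] = "Button"
            entry.update(issue="missing-name", fixed=True)
        log.append(entry)
    return elements, log

page = [{"tag": "img", "src": "hero.png"}, {"tag": "button"}, {"tag": "p", "text": "Hi"}]
fixed, audit = remediate(page)
print(sum(e["fixed"] for e in audit))  # → 2 of the 3 elements needed a fix
```

The audit list is the part that earns its keep: each entry says what was found, what was done, and when, which is exactly the good-faith paper trail described above.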

A Practical Combined Approach

For most organizations, a three-layer setup hits the sweet spot.

Layer 1: Automated Remediation (Continuous)

Deploy an accessibility widget that runs on every page load and fixes common violations in real time. You get immediate coverage for the most frequent issues, and your compliance documentation builds itself in the background.

Layer 2: Automated Scanning (Monthly)

Run regular scans to track your overall compliance posture, catch new violations that sneak in with content changes, and build trend reports that show improvement over time.
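If you run your own scanner rather than a hosted one, a monthly cadence is easy to automate. As one hedged example, Pa11y's companion tool pa11y-ci can crawl every URL in a sitemap, and a crontab entry can schedule it; the paths, domain, and log location below are placeholders to adapt to your setup:

```shell
# Hypothetical crontab entry: at 03:00 on the 1st of every month, scan every
# URL in the sitemap with pa11y-ci and keep a dated JSON report for trending.
# Assumes pa11y-ci is installed and /var/log/a11y exists; % must be escaped in cron.
0 3 1 * * pa11y-ci --sitemap https://www.example.com/sitemap.xml --json > /var/log/a11y/scan-$(date +\%Y-\%m).json 2>&1
```

Keeping the dated JSON files around is what makes the trend reports possible: each month's output becomes a data point you can diff against the last.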

Layer 3: Manual Testing (Quarterly or As Needed)

Do periodic manual testing, especially when you launch new features, redesign pages, or get a complaint. Manual testing should zero in on what automated tools miss: screen reader usability, keyboard navigation flow, and content quality.

A Quick Note on SEO

Every time an automated scanner flags missing alt text, skipped headings, or slow page loads, it is flagging something Google also cares about. Fixing those issues helps your search ranking as much as it helps a screen reader user. If you want the full picture, read How ADA Compliance Boosts Your SEO.

The Bottom Line

Automated testing cannot replace manual testing, and manual testing alone cannot keep up with a whole website. The best accessibility programs use automated tools for breadth and speed, and manual testing for depth and nuance.

Ardor Accessibility handles the first two layers, real-time automated remediation and monthly WCAG scanning, with enough documentation to support the third. Request a demo and we will show you how it looks on your site.