The complete guide to accessibility audits

How to run an accessibility audit that actually finds everything.

Automated scanners catch only 30–40% of WCAG issues — the rest needs a human. This guide walks you through a hybrid WCAG-EM methodology: what tools do for you, what you still have to check yourself, and how to write a report your stakeholders can act on.

  • WCAG-EM based methodology
  • Hybrid automated + manual
  • All 3 surfaces: web, PDF, Office
  • Report template included

What is an accessibility audit?

An accessibility audit is a structured evaluation of a digital product against a standard — usually WCAG 2.2 AA — to identify barriers for users with disabilities and document how to fix them.

Scope, not opinion

A real audit picks a defined sample of pages / documents / components and evaluates them against named success criteria — not a vague “we looked around”.

Verifiable findings

Every finding names the WCAG success criterion (e.g. 1.4.3 Contrast Minimum), quotes the offending content, and describes a concrete fix — not “improve accessibility”.

Actionable report

The output is a prioritised list developers and content authors can work from — severity, affected users, effort estimate, and remediation steps in plain language.

Evidence for compliance

For EU EAA, US ADA, UK PSBAR, AU DDA, CA AODA — a documented audit with dates and findings is what regulators and courts actually look at. Keep the report.

What a tool catches — and what it doesn't

Automated tools reliably catch 30–40% of WCAG issues. The rest is context, judgement, and lived experience. A competent audit always combines both.

Automated

What scanners can verify on their own

  • Missing alt attributes on images
  • Missing form labels and aria-labels
  • Text / background contrast ratios (4.5:1 / 3:1)
  • Heading hierarchy gaps (h2 → h4 skip)
  • Page language declaration
  • Duplicate IDs, invalid HTML structure
  • Link text patterns (“click here”, empty links)
  • Target size (24×24 CSS px, WCAG 2.5.8)
  • PDF tagged-PDF flag, document language, PDF/UA identifier
  • Document properties (title, author, metadata)
  • Embedded fonts, Unicode encoding
  • Tracked changes / hidden content / comments in Office files
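The contrast checks in this list are pure arithmetic — which is exactly why scanners can verify them unaided. A minimal sketch of the check, using the relative-luminance and contrast-ratio formulas from the WCAG 2.x definitions (function names are our own):

```python
def _linear(c: float) -> float:
    # sRGB channel (0-1) to linear light, per the WCAG relative-luminance definition
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple[int, int, int]) -> float:
    r, g, b = (_linear(v / 255) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple[int, int, int], bg: tuple[int, int, int]) -> float:
    # WCAG formula: (L1 + 0.05) / (L2 + 0.05), lighter luminance on top
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black on white is the maximum, 21:1; AA requires 4.5:1 (normal) / 3:1 (large text)
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

A tool applies this to every text/background pair the rendered CSS produces — which is also why headless-browser rendering matters for contrast.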
Manual

What only a human can judge

  • Whether alt text is meaningful (not just present)
  • Whether the reading order is logical for the layout
  • Whether link purpose is clear from surrounding context
  • Keyboard operability of custom widgets (menus, modals)
  • Focus visibility and logical focus order
  • Screen reader announcement quality (NVDA / JAWS / VoiceOver)
  • Video captions accuracy & synchronisation
  • Audio descriptions for non-decorative visuals
  • Contrast on gradient, image, or video backgrounds
  • Whether error messages help the user recover
  • Cognitive load — language complexity, timing, distractions
  • Consistency of help mechanisms across pages (WCAG 3.2.6)

WCAGHub tools flag both sides. Checks that fail automatically go into Issues; checks that pass automatically but still need a human eye go into a dedicated Manual Review tab — so you never miss the “automated green” blind spots.

The 8-step accessibility audit process

Loosely based on the W3C's WCAG-EM methodology, adapted for real product teams with websites, PDFs, and Office documents.

  1. Define scope

    Write down exactly what's in and out. For a website: list the URLs (or URL patterns). For documents: which files, from which folder, up to what date. For Office: which templates, which departments.

    Tool step: none yet — this is a writing exercise. A good scope is one paragraph long.

  2. Pick the standard

    Most audits target WCAG 2.2 AA. For public sector: EN 301 549 (EU), Section 508 (US), AODA (CA), PSBAR (UK), NZGWAS (NZ). For PDFs: add PDF/UA-1 (ISO 14289-1). Document which standard you picked and why — the report refers to it dozens of times.

    Tool step: in WCAGHub, pick your jurisdiction in the scan settings — it changes the report's law references and terminology. See the Compliance hub for the full list.

  3. Sample the right pages

    You can't audit every page of a 2,000-page site. WCAG-EM calls for a structured sample: home, key templates, common task flows (login, checkout, contact), and a handful of random pages. Aim for 15–25 URLs for medium sites.

    Tool step: run the Web Checker on each sampled URL. Each scan runs 77+ detections against WCAG 2.2 and returns an Issues list, a Passed list, and a Manual Review list.
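The sampling logic itself is simple enough to script. A sketch — the helper name and the fixed seed are our choices, not part of WCAG-EM; the point is that a seeded random tail makes the sample reproducible for re-audits:

```python
import random

def build_sample(all_urls, key_templates, task_flows, n_random=5, seed=42):
    """WCAG-EM style sample: mandatory pages plus a reproducible random slice.

    key_templates and task_flows are lists you curate by hand; only the
    random tail is drawn from the full crawl.
    """
    mandatory = ["/"] + key_templates + task_flows
    pool = [u for u in all_urls if u not in mandatory]
    rng = random.Random(seed)  # fixed seed -> same sample on every re-audit
    return mandatory + rng.sample(pool, min(n_random, len(pool)))

sample = build_sample(
    all_urls=[f"/page-{i}" for i in range(2000)],
    key_templates=["/products", "/article-template"],
    task_flows=["/login", "/checkout", "/contact"],
)
print(len(sample))  # 11 URLs: 6 mandatory + 5 random
```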

  4. Scan PDFs against PDF/UA

    A website is only half the picture. Annual reports, forms, accessibility statements — anything downloadable is part of scope. PDF/UA-1 (ISO 14289-1) is the governing standard; the Matterhorn Protocol is the industry's detailed checklist on top of it.

    Tool step: run the PDF Checker. It runs 55 native checks plus VeraPDF (the reference ISO-14289-1 validator) and merges the results. See our PDF/UA & Matterhorn checklist for the mapping.
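For a quick pre-triage before running a full validator, the tagged-PDF flag and document language can sometimes be sniffed straight from the file bytes. This is a rough heuristic only — it is not how VeraPDF works, and it finds nothing when the document catalog sits inside a compressed object stream:

```python
import re

def quick_pdf_flags(pdf_bytes: bytes) -> dict:
    """Byte-level sniff for the tagged-PDF flag and a document language entry.

    Triage only: a miss here means "check properly with a real validator",
    not "fail" -- ISO 14289-1 validators parse the full PDF object tree.
    """
    return {
        "tagged": re.search(rb"/MarkInfo\s*<<[^>]*?/Marked\s+true", pdf_bytes) is not None,
        "lang": re.search(rb"/Lang\s*\(", pdf_bytes) is not None,
    }

flags = quick_pdf_flags(b"<< /MarkInfo << /Marked true >> /Lang (en-GB) >>")
print(flags)  # {'tagged': True, 'lang': True}
```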

  5. Scan Office documents (DOCX / XLSX / PPTX)

    Government portals, internal intranets, and publisher workflows all generate Word, Excel, and PowerPoint files. These have their own accessibility rules — heading styles, table headers, slide titles, named sheet ranges — that PDF tools don't cover.

    Tool step: upload samples to the Document Checker. It runs 31 detections across all three Office formats, with format-specific checks (slide titles for PPTX, sheet structure for XLSX, tracked changes for DOCX).
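Some of these Office checks are easy to approximate yourself, because a DOCX file is just a zip of XML. A sketch of a tracked-changes triage check — string matching, not proper namespace-aware XML parsing, which is what a real checker would do:

```python
import zipfile

def docx_has_tracked_changes(docx) -> bool:
    """Unaccepted revisions live in word/document.xml as <w:ins> / <w:del>
    elements. Accepts a path or a file-like object."""
    with zipfile.ZipFile(docx) as z:
        xml = z.read("word/document.xml")
    return b"<w:ins " in xml or b"<w:del " in xml
```

Shipping a document with tracked changes left in is both an accessibility issue (screen readers announce revision markup) and frequently a confidentiality one.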

  6. Do the manual passes

    For each sampled page or document, do at least four manual passes — this is where the other 60–70% of issues live.

    • Keyboard only. Unplug the mouse. Tab through every interactive element. Can you reach everything? Is the focus ring visible at every step? Can you escape modals with Esc?
    • Screen reader. Turn on NVDA (Windows), VoiceOver (Mac / iOS), or TalkBack (Android). Listen to how each page announces. Are headings spoken in the right order? Are form fields labelled?
    • Zoom & reflow. Zoom to 400% and resize to 320 CSS pixels wide. Does content reflow or do you get a horizontal scrollbar? Does any content disappear?
    • Cognitive pass. Read every page aloud. Is the language plain? Are error messages specific? Is the task flow consistent across pages (WCAG 3.2.6 Consistent Help, 3.3.7 Redundant Entry)?

    Tool step: WCAGHub reports include a Screen Reader Compatibility Score (0–100) per page and surface the items most likely to trip NVDA / JAWS / VoiceOver. Use that to prioritise the manual pass.

  7. Write the report

    A good audit report has: an executive summary, the scope, the standard used, a findings table grouped by severity, and per-finding details (WCAG reference, affected user group, steps to reproduce, fix suggestion, effort estimate).

    Tool step: every WCAGHub scan generates a branded PDF report — itself accessible — with all of the above, plus an AI Summary that turns the findings into executive-level language and maps them to your selected jurisdiction's law references.
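The per-finding fields above map naturally onto a small data structure, which makes the severity grouping mechanical. A sketch — field names and severity labels are illustrative, not a fixed schema:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Finding:
    wcag_ref: str        # e.g. "1.4.3 Contrast (Minimum) (Level AA)"
    severity: str        # "critical" | "serious" | "moderate" | "minor"
    affected_users: str  # e.g. "screen reader users"
    location: str        # URL or file, plus the offending element
    fix: str             # concrete remediation step, plain language
    effort: str          # e.g. "S" / "M" / "L"

def group_by_severity(findings):
    # Order the report worst-first; drop severities with no findings
    order = ["critical", "serious", "moderate", "minor"]
    grouped = defaultdict(list)
    for f in findings:
        grouped[f.severity].append(f)
    return {s: grouped[s] for s in order if grouped[s]}
```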

  8. Remediate, re-scan, repeat

    The first audit is just the start. Set a cadence — quarterly for active sites, after every release for template changes, monthly for high-risk public sector portals. Re-scan the same sample each time; track the score over time.

    Tool step: WCAGHub credits never expire — buy an Enterprise pack once, then spread scans over the year. Keep every report in version control as evidence.
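Tracking the score over time takes very little machinery — the useful number is the delta between successive scans of the same sample. A sketch, assuming you export (date, score) pairs from each report:

```python
def score_trend(history):
    """history: list of (iso_date, score) from successive re-scans of the
    same sample. Returns per-interval deltas so regressions stand out."""
    return [
        (d2, s2 - s1)
        for (d1, s1), (d2, s2) in zip(history, history[1:])
    ]

history = [("2025-01-15", 62), ("2025-04-15", 78), ("2025-07-15", 74)]
print(score_trend(history))  # [('2025-04-15', 16), ('2025-07-15', -4)]
```

A negative delta after a release is the regression signal worth wiring into your release checklist.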

Audit all three surfaces, not just the website

A real compliance audit covers every digital artefact the public can reach — not just HTML. That's why WCAGHub ships three tools on one account.

Web pages

77+ detections across WCAG 2.2 A & AA

  • All 6 new WCAG 2.2 criteria (Focus Not Obscured, Target Size, Dragging, Consistent Help, Redundant Entry, Accessible Authentication)
  • Real Chrome headless rendering — catches CSS-computed contrast issues JS-only tools miss
  • Framework-aware fixes (React, Vue, Angular, WordPress, Bootstrap, Tailwind)
  • Screen reader compatibility score (NVDA / JAWS / VoiceOver)
  • Smart prioritiser: impact + fix difficulty + time estimate per issue
Open Web Checker

PDF documents

55 native checks + 106 VeraPDF (ISO 14289-1)

  • VeraPDF integration — the reference PDF/UA-1 validator used by national archives
  • 11 categories: structure, headings, images, tables, links, forms, contrast, fonts, metadata, content, navigation
  • Fix-tool guidance per issue: Adobe Acrobat Pro, MS Word, Adobe InDesign, LibreOffice
  • Maps to the 31-checkpoint Matterhorn Protocol — see our full mapping
  • AI-powered VeraPDF output translator — plain English for cryptic ISO error codes
Open PDF Checker

Office documents

31 detections across DOCX, XLSX, PPTX

  • Three formats in one tool — most competitors only handle Word
  • Format-specific rules: slide titles for PPTX, sheet structure for XLSX, tracked changes for DOCX
  • 11 categories including headings, images, tables, lists, fonts, navigation
  • AI fix guidance that knows Word / Excel / PowerPoint ribbons (not generic advice)
  • Detects hidden content, comments, and other reviewer leftovers
Open Document Checker

Five audit mistakes we see all the time

Most “accessibility audits” that end up in a legal file got one of these wrong.

Running one scanner and calling it done

A single automated scan catches 30–40% of issues at best. If your audit report doesn't include a keyboard pass, a screen-reader pass, and a zoom pass, it isn't an audit — it's a linter output.

Scanning only the homepage

Home pages are usually the most polished. The login, search results, PDF downloads, account settings, and checkout — that's where most barriers live. Always sample across templates, not just the front page.

Ignoring PDFs and Office documents

EAA, EN 301 549, Section 508 and AODA all explicitly include downloadable documents. “We made the website WCAG-AA” is not compliance if your annual report PDF is untagged and your policy DOCX has no heading styles.

Writing findings without WCAG references

“Add alt text to images” is not actionable at scale. “WCAG 1.1.1 (Level A): 14 images on /products/ are missing alt attributes, e.g. <img src="hero.jpg">” is.
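Findings of this shape are scriptable with nothing but the standard library. A sketch that emits WCAG-referenced failure lines for images with a missing alt attribute — note that alt="" is a deliberate decorative marker and passes; only an absent attribute fails 1.1.1:

```python
from html.parser import HTMLParser

class MissingAltScanner(HTMLParser):
    """Collects <img> tags with no alt attribute and reports each one as
    a WCAG-referenced finding line."""
    def __init__(self):
        super().__init__()
        self.failures = []

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "img" and "alt" not in attr_map:
            src = attr_map.get("src", "?")
            self.failures.append(
                f'WCAG 1.1.1 (Level A): <img src="{src}"> is missing an alt attribute'
            )

scanner = MissingAltScanner()
scanner.feed('<img src="hero.jpg"><img src="logo.png" alt="WCAGHub logo"><img src="spacer.gif" alt="">')
print(scanner.failures)
# ['WCAG 1.1.1 (Level A): <img src="hero.jpg"> is missing an alt attribute']
```

Whether the alt text that *is* present actually describes the image remains a manual-review judgement, as covered earlier.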

Treating the audit as a one-off

Every release reintroduces regressions. A single audit is a snapshot, not a programme. Set a re-scan cadence and track score over time — that's the number your board actually cares about.

Frequently asked questions

The questions we get most often from agencies, public sector buyers, and in-house compliance teams.

How long does an accessibility audit take?

For a 20-URL site plus a handful of PDFs / documents, budget 2–5 working days — one day for scans and triage, 1–3 for manual passes, and one for writing up. Large portfolios or complex web apps can take weeks.

Do I have to use WCAG 2.2, or is WCAG 2.1 still fine?

WCAG 2.2 is the current W3C recommendation and supersedes 2.1. EN 301 549 v4 (Sep 2024) references it. New audits should target 2.2 AA. Existing 2.1 policies remain valid for their term, but any new procurement should say 2.2.

Is an automated scan enough for compliance?

No. EU EAA, US ADA case law, and AU DDA all assume the tested standard is WCAG, which contains criteria that cannot be judged automatically (e.g. 1.1.1 for meaningful alt text, 2.4.2 for descriptive titles). A defensible audit always combines automated and manual evaluation.

Who should run the audit — internal team or external?

Both work. Internal teams are cheaper and build capability; external audits add independence, useful for regulated sectors. Many organisations run quarterly internal audits plus one external per year. WCAGHub works for both — buy credits once, scan as often as you need.

What about VPATs and Accessibility Conformance Reports?

A VPAT (Voluntary Product Accessibility Template) is the ITI's blank template; the filled-in result is an Accessibility Conformance Report (ACR). It maps each WCAG / Section 508 criterion to a conformance level (Supports, Partially Supports, Does Not Support). A WCAGHub audit report gives you the underlying evidence; the VPAT/ACR is the summary document built from it.

Which jurisdictions does WCAGHub support?

Six market jurisdictions: AU (DDA), US (ADA / Section 508), EU (EAA / EN 301 549), UK (Equality Act / PSBAR), CA (ACA / AODA), and Global (WCAG 2.2 as voluntary baseline). The report's law references and terminology change based on the jurisdiction you pick. See the Compliance hub.

Can I re-scan after we fix things?

Yes — each re-scan uses one credit. Since credits never expire, many teams buy a volume pack and re-scan after every release. See pricing and Enterprise packs.

Does the audit cover mobile apps?

WCAGHub tools cover web (including responsive and PWAs), PDFs, and Office files. Native iOS and Android apps need platform-specific testing (XCTest / Espresso accessibility APIs) — that's not in scope for our tools today. The methodology in this guide still applies.