What's in an Accessibility Audit Report? A Complete Breakdown
You have run an accessibility audit on your website — or you are considering purchasing one. But what exactly should you expect to receive? An accessibility audit report is more than a list of errors. At its best, it is a complete resolution toolkit that tells you exactly what is wrong, why it matters, and how to fix it — in language both a business owner and a developer can act on.
Not all reports are created equal, though. Some are little more than raw scanner output wrapped in a PDF. Others are genuinely useful documents that can drive real remediation. This guide breaks down every section you should find in a professional accessibility audit report, explains what each section does, and shows you how to use the report to actually fix your site.
1. Executive Summary
The executive summary is the most important section for business owners, stakeholders, and anyone who needs to understand the findings without reading the full technical report. A good executive summary answers three questions in plain English:
- How many issues were found? Not just the raw count of individual violation instances, but the number of unique problems. A site might have 500 instances of missing alt text, but that is one problem — images are being published without text alternatives. The summary should distinguish between the total instance count and the number of distinct issues.
- How severe are they? A breakdown by severity level — critical, serious, moderate, minor — gives an immediate sense of the compliance posture. A site with two critical issues and twelve serious ones is in very different shape than a site with three hundred moderate warnings.
- What needs to happen next? The summary should frame the path forward: which issues you can fix yourself, which require a developer, and which originate from third-party tools (like embedded widgets, chat plugins, or platform code) that you will need to raise with your vendors.
What to look for: The best executive summaries translate technical findings into business language. Instead of "42 instances of WCAG SC 1.1.1 failures," it should say something like "42 images on your site have no text description, which means screen reader users cannot understand them — and missing alt text is the most commonly cited violation in ADA lawsuits." If your report's summary reads like raw data, it is not doing its job.
2. Violation Inventory with Severity Levels
The core of any audit report is the violation inventory — a structured list of every accessibility issue found during the scan. Each violation entry should include:
- Rule name and description: What the issue is, in both technical and plain-English terms. For example, "image-alt: Images must have alternate text" alongside a human-readable explanation like "Product photos on your homepage do not have text descriptions."
- Severity level: Critical, serious, moderate, or minor. These levels are defined by the axe-core accessibility engine based on the impact on users with disabilities. Critical and serious violations are the ones most likely to trigger legal action and most likely to block real users.
- Instance count: How many times this specific violation appears across the pages tested. This helps gauge the scope of the problem — is it an isolated incident or a systemic template issue?
- Affected pages: Which specific URLs contain this violation. This is essential for targeted remediation, especially on larger sites where different page templates may have different issues.
A well-organized violation inventory groups related issues together rather than listing every single instance separately. If your site has 200 images missing alt text across 15 pages, the report should present that as one issue with 200 instances — not 200 separate line items.
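The grouping logic described above can be sketched in a few lines. This is an illustrative example, not any particular scanner's output format; the field names (`rule`, `severity`, `page`) are assumptions:

```python
def group_violations(instances):
    """Collapse raw scanner instances into one entry per unique rule,
    keeping an instance count and the set of affected pages."""
    grouped = {}
    for inst in instances:  # each instance: {"rule", "severity", "page"}
        entry = grouped.setdefault(
            inst["rule"],
            {"rule": inst["rule"], "severity": inst["severity"],
             "count": 0, "pages": set()},
        )
        entry["count"] += 1
        entry["pages"].add(inst["page"])
    # Most frequent issues first
    return sorted(grouped.values(), key=lambda e: -e["count"])
```

Run against 200 raw alt-text instances, this produces a single `image-alt` entry with `count: 200` and a list of affected pages, which is exactly the shape a useful report presents.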
3. WCAG Criteria Mapping
Every violation in a professional report should reference the specific WCAG 2.1 success criterion it violates. This mapping serves several critical purposes:
- Legal documentation: Courts, settlement agreements, and the Department of Justice all reference WCAG 2.1 Level AA. A report that maps findings to specific success criteria (e.g., SC 1.1.1 Non-text Content, SC 1.4.3 Contrast Minimum, SC 4.1.2 Name, Role, Value) provides evidence that your audit was conducted against the correct standard.
- Developer clarity: WCAG success criteria are precise, well-documented specifications. When a developer sees "SC 2.4.7 Focus Visible" cited, they can look up the exact requirement, read the understanding document, and find technique-specific implementation guidance on the W3C website.
- Remediation verification: After fixes are implemented, the WCAG mapping makes re-testing straightforward. You can verify that the specific criteria previously failed now pass.
Red flag: If an audit report does not reference specific WCAG success criteria, it is not a real accessibility audit. Vague statements like "your site has color issues" or "some images need attention" are not actionable and provide no legal documentation value. Every finding should tie back to a numbered criterion.
4. Remediation Guidance with Code Examples
This is where a good report earns its value. Identifying problems is only half the job — the other half is showing you exactly how to fix them. Professional remediation guidance includes:
Plain-English Description for the Business Owner
Each issue should have a description that a non-technical person can understand. For example: "The white text on the tan background in your hero section is hard to read for people with low vision. The contrast ratio is 2.8:1 — WCAG requires at least 4.5:1." This helps the business owner understand why the fix matters and communicate the priority to their team.
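Contrast figures like the 2.8:1 above come from the relative-luminance formula defined in WCAG 2.x. A minimal sketch of that calculation (the tan hex value below is illustrative, not taken from any real audit):

```python
def _linear(channel):
    # sRGB channel (0-255) -> linear value, per the WCAG 2.x definition
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(hex_color):
    hex_color = hex_color.lstrip("#")
    r, g, b = (int(hex_color[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * _linear(r) + 0.7152 * _linear(g) + 0.0722 * _linear(b)

def contrast_ratio(fg, bg):
    """WCAG contrast ratio, always >= 1; normal text needs >= 4.5."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)
```

White on a tan like `#d2b48c` comes out well under the 4.5:1 threshold, which is why a report should also suggest a compliant replacement color rather than just flag the failure.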
Technical Instructions for the Developer
Each issue should also have specific, developer-ready instructions: what HTML element to change, what attribute to add or modify, and what the corrected code should look like. The best reports include before-and-after code blocks showing the current (non-compliant) code alongside the corrected version. For color contrast issues, the report should suggest specific WCAG-compliant replacement colors.
Scope and Effort Estimates
Good remediation guidance indicates whether a fix is a one-time template change (fix it once, it applies everywhere) or a page-by-page effort (each instance needs individual attention). It also estimates the relative effort: is this a five-minute CSS change, or a multi-hour refactor of a custom component? This helps development teams plan their sprints and prioritize work.
For example, a remediation entry for missing form labels might look like this:
Current code:

```html
<input type="email" placeholder="Your email">
```

Fixed code:

```html
<label for="email">Email address</label>
<input id="email" type="email" placeholder="Your email">
```
That level of specificity — showing the actual fix, not just describing the problem — is what separates a useful report from a useless one.
5. Third-Party vs. First-Party Issue Separation
Modern websites are built from a combination of your own code and third-party tools: embedded review widgets, live chat plugins, analytics scripts, payment forms, social media embeds, and platform-level code from Shopify, WordPress, Squarespace, or Wix. Many accessibility violations originate from these third-party components — and the fix process is fundamentally different.
- First-party issues — violations in your own HTML, CSS, and JavaScript — are under your direct control. Your developer can fix them immediately by editing your site's code or templates.
- Third-party issues — violations in embedded widgets, plugins, or platform code — require you to contact the vendor. You cannot fix their code directly. You need to raise the issue with them and request a fix or update.
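The first- vs third-party split usually reduces to comparing the violating resource's hostname against the site's own domains. A minimal sketch, with hypothetical hostnames:

```python
from urllib.parse import urlparse

# Hypothetical first-party hostnames for the audited site
FIRST_PARTY_HOSTS = {"example.com", "www.example.com"}

def classify_source(resource_url):
    """Label a violating script/iframe/widget source as first- or third-party."""
    host = urlparse(resource_url).hostname or ""
    return "first-party" if host in FIRST_PARTY_HOSTS else "third-party"
```

Anything classified third-party goes on the vendor-contact list instead of the developer's fix list.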
A professional report should clearly identify which violations come from third-party sources and, ideally, group them by vendor. The best reports — like those from ADA Audit Report — go a step further and include copy-paste email templates you can send to each vendor, with the specific accessibility issues cited and questions to ask about their remediation timeline.
Why this matters legally: If you receive a demand letter, being able to demonstrate that certain violations originate from third-party code — and that you have already contacted those vendors requesting fixes — shows good faith and may limit your liability for issues outside your direct control.
6. Page-by-Page Results
While the violation inventory groups issues by type, the page-by-page results section shows what was found on each specific URL that was tested. This section serves as both a verification record and a remediation map.
For each page tested, you should see:
- The URL that was scanned
- The number of violations found on that page, broken down by severity
- The specific rules that failed on that page
- The HTML elements involved (for instance, the exact `<img>` tag missing alt text)
This section is especially valuable for larger sites where different page types have different issues. Your homepage might pass color contrast checks while your blog template fails them. Your contact page might have properly labeled forms while your checkout flow does not. Page-by-page results let your development team address issues template by template rather than trying to fix everything at once.
7. Compliance Score or Rating
Many audit reports include an overall compliance score — a percentage or grade that gives a quick snapshot of where the site stands. While no single number can capture the full complexity of WCAG conformance, a well-calibrated score is useful for:
- Benchmarking: Establishing a baseline that you can compare against after remediation. "We went from 45% to 89% compliance" is a concrete metric to share with stakeholders.
- Prioritization: A score weighted by severity helps you understand whether your site has a few serious problems or many minor ones — two very different situations that require different approaches.
- Communication: Executives and non-technical stakeholders respond to scores. A clear rating makes it easier to secure budget and prioritize remediation work.
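One way a severity-weighted score might be computed; the weights here are illustrative assumptions, not a standard, and real tools calibrate differently:

```python
# Illustrative severity weights -- an assumption, not a standard
SEVERITY_WEIGHTS = {"critical": 10, "serious": 5, "moderate": 2, "minor": 1}

def compliance_score(failed, passed_checks):
    """Severity-weighted pass rate: 100 means every automated check passed.
    `failed` maps severity level -> number of failing checks."""
    penalty = sum(SEVERITY_WEIGHTS[sev] * n for sev, n in failed.items())
    if passed_checks + penalty == 0:
        return 100.0
    return round(100 * passed_checks / (passed_checks + penalty), 1)
```

Under this weighting, a site with two critical failures scores far lower than one with a handful of minor warnings, which matches the intuition that severity, not raw count, drives risk.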
Be cautious of any provider that claims a score of "100%" or "fully compliant." Automated tools can only test 30-40% of WCAG criteria. A perfect automated score means you passed every testable criterion, not that your site is fully accessible. The remaining criteria require human judgment — things like whether alt text is actually meaningful, whether reading order makes sense, and whether custom widgets are truly usable with assistive technology.
How to Use Your Audit Report
Having a thorough report is only valuable if you act on it. Here is a practical workflow for turning an audit report into a fixed website:
- Read the executive summary first. Understand the scope of the problem before diving into details. Share the summary with anyone who needs to approve remediation work or allocate developer time.
- Share the full report with your developer. The remediation guidance section is written for them. A developer who receives a report with before-and-after code examples and affected page URLs can start fixing issues immediately — no additional research required.
- Prioritize critical and serious violations. These are the issues most likely to trigger legal action and most likely to block real users. Use the prioritized fix list to tackle the highest-impact items first. Many reports rank fixes by a combination of severity, frequency, and effort.
- Contact third-party vendors about their issues. Send the vendor-specific findings (or use the provided email templates) to each vendor whose code is generating violations. Document this communication — it demonstrates good faith.
- Fix template-level issues first. Many violations are systemic: a missing `lang` attribute on the `<html>` tag, a site-wide navigation without proper ARIA labels, or a footer template with low-contrast text. Fixing these once resolves the issue across every page on the site.
- Re-test after fixes. Run another audit scan after implementing fixes to verify they worked and to catch any regressions. The WCAG criteria mapping from the original report makes verification straightforward — check that each previously failed criterion now passes.
- Keep the report on file. Your audit report is a legal document. If you ever receive a demand letter, this report — along with records of the fixes you made — demonstrates proactive compliance efforts and good faith. Courts consistently view documented remediation favorably.
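The ranking described in the workflow above (severity, then frequency, then effort) can be sketched as a simple sort; the field names and effort units are assumptions for illustration:

```python
SEVERITY_RANK = {"critical": 0, "serious": 1, "moderate": 2, "minor": 3}

def prioritize(issues):
    """Order fixes: highest severity first, then most instances,
    then lowest estimated effort (cheap wins first)."""
    return sorted(
        issues,
        key=lambda i: (SEVERITY_RANK[i["severity"]], -i["count"], i["effort_hours"]),
    )
```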
What Bad Audit Reports Look Like
Unfortunately, the accessibility industry has its share of low-quality reports. Here are the warning signs of a report that will not help you:
- Raw scanner output in a PDF wrapper. If the report looks like someone ran axe-core or WAVE, pasted the JSON output into a document, and added a logo, you are paying for formatting — not expertise. Good reports interpret and organize the data.
- No remediation instructions. A report that says "15 images are missing alt text" without showing you which images, on which pages, and what the alt text should describe is only half-finished.
- No WCAG criteria references. Without specific success criterion mapping, neither your developer nor your attorney can use the report effectively. Every finding needs a WCAG reference.
- Hundreds of pages of unorganized data. Volume is not value. A 300-page report that lists every individual instance without grouping, prioritization, or context is harder to use than a 20-page report that organizes findings into actionable steps.
- No distinction between your code and third-party code. If the report blames you for violations in an embedded Yelp widget or a Shopify platform component, it is not accounting for the reality of modern web development. You need to know what you can fix and what requires vendor intervention.
Related Resources
For more on ADA website accessibility, explore these guides:
- ADA Website Compliance: What Every Small Business Owner Needs to Know in 2026 — The foundational guide to ADA requirements, legal risk, and practical steps.
- ADA Compliance Audit: What It Covers, What It Costs, and How to Get One — Understand the audit process, pricing, automated vs. manual approaches, and how to choose a provider.
- ADA Website Audit: How to Test Your Site for Accessibility Issues — A hands-on guide to testing your site yourself using free tools and manual checks.
About ADA Audit Report
Our reports at ADA Audit Report are built to be resolution toolkits — not data dumps. Every report includes an executive summary in plain English, a complete violation inventory with severity ratings, WCAG criteria mapping, before-and-after code examples, third-party issue separation with vendor email templates, page-by-page results, and a prioritized remediation plan. We use axe-core — the same engine trusted by Microsoft, Google, and U.S. government agencies — and our full report is $49.
We are a technology company based in Los Angeles, California. We built the report we wished existed when we first started helping small businesses with accessibility — one that a business owner can understand and a developer can immediately act on.
See what your report would look like
Start with a free scan to see your site's top accessibility issues. Get the full report for $49.
Run a Free Scan