You ran a free accessibility scan. It found eight issues. You had your developer fix them. You are compliant now, right?

Not quite.

Every honest accessibility tool documents the same limit. Automated scanners catch roughly 30 to 40 percent of the barriers that block real users with disabilities. That is not a weakness of any single product. It is a structural reality of how accessibility is defined.

Here is what automation sees, what it does not see, and what you can do about the gap without paying five figures for a manual audit.

What automation catches

Open-source engines like axe-core power most scanners you have heard of. They read your rendered HTML and check a fixed set of rules against it. They are good at a specific kind of failure.

They catch missing alt text, color contrast below the minimum ratio, form inputs without labels, ARIA attributes used incorrectly, linked images with no accessible name, and viewport settings that disable zoom. These are the violations that appear on nearly every scan because they come from markup that is either there or not there.
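To make that concrete: the contrast check is pure arithmetic a scanner can run deterministically on every pixel of rendered text. Here is a minimal sketch of the WCAG 2.x formula (the function names are illustrative, not axe-core's actual API):

```javascript
// Sketch of the WCAG 2.x contrast-ratio math a scanner runs deterministically.
// Function names are illustrative, not axe-core's actual API.
function relativeLuminance(hex) {
  // "#rrggbb" -> relative luminance per WCAG 2.x
  const [r, g, b] = [1, 3, 5].map(i => {
    const c = parseInt(hex.slice(i, i + 2), 16) / 255;
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

function contrastRatio(fg, bg) {
  const [hi, lo] = [relativeLuminance(fg), relativeLuminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

// WCAG AA requires 4.5:1 for normal-size text
contrastRatio('#000000', '#ffffff'); // ~21:1, passes
contrastRatio('#999999', '#ffffff'); // ~2.8:1, fails
```

Because the rule reduces to a formula like this, a machine can check it perfectly. The failures in the next section do not reduce to a formula.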

This kind of failure is real. It blocks real users. And it is the bulk of what plaintiffs' attorneys cite when they file website accessibility lawsuits. Automation is not useless. It is foundational.

What automation misses

Automation cannot tell you whether a custom date picker works with a keyboard. It cannot tell you whether your checkout flow is completable without a mouse. It cannot tell you whether the error message your form shows is announced to a screen reader when the user submits. It cannot tell you whether your video has captions that actually match the audio instead of auto-generated word salad.

It cannot tell you whether your modal traps focus the way it should, or returns focus to the element that opened it when it closes. It cannot tell you whether the 25 product images you label "product photo" convey the same information to a blind shopper as they do to a sighted one.
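To see why this class of failure is behavioral rather than structural, here is the wrap-around logic a correct modal focus trap needs, as a minimal sketch (the function name and shape are illustrative, not any library's API). A scanner can read the modal's markup; only pressing Tab reveals whether logic like this actually runs:

```javascript
// Sketch of the Tab-wrapping a modal focus trap needs (illustrative only).
// Given the index of the currently focused element among the modal's
// focusable elements, return the index that should receive focus next.
function nextFocusIndex(current, count, shiftKey) {
  if (count === 0) return -1;               // nothing focusable: the trap is broken
  if (shiftKey) {
    // Shift-Tab from the first element wraps around to the last
    return current <= 0 ? count - 1 : current - 1;
  }
  // Tab from the last element wraps back to the first
  return current >= count - 1 ? 0 : current + 1;
}
```

The markup of a modal with and without this behavior can be identical. That is exactly the gap automation cannot cross.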

The common thread: anything that requires understanding intent, sequence, or equivalence is outside automation. A scanner can tell you an image has alt text. It cannot tell you the alt text is wrong.

The three gaps that matter most

Not all gaps have equal consequences. These three carry the most exposure:

Keyboard traps. If a user tabs into a widget and cannot tab out, the entire site becomes unusable. Automation flags some causes. It does not flag the behavior. A human must press Tab and see what happens.

Screen reader equivalence. Sighted users see a page. Screen reader users hear one. When the two experiences diverge meaningfully, the site fails. Automation can verify labels exist. It cannot verify that the label accurately describes what the element does.

Flow completability. Can a person who cannot use a mouse complete your contact form, your booking flow, your checkout? If the answer is no, it does not matter how many automated checks pass. A site that cannot be used is not compliant in any meaningful sense.

Why the gap still matters if you are not getting sued

Website accessibility lawsuits filed in federal court passed 4,600 in 2025 alone. The overwhelming target: small and mid-sized businesses. Most cases settle for $5,000 to $25,000. The cost is not just the settlement. It is the discovery, the remediation deadline, the public complaint, the insurance call.

But even if you are never sued, the gap matters. One in four U.S. adults has a disability. Some of them are already visiting your site. Some of them are trying to buy from you, book you, or hire you, and bouncing because they cannot complete the task. Automation misses exactly the barriers that cost you those customers.

How to close the gap without the five-figure audit

You have three options. The first is free. The second is $49. The third is expensive.

Option one: DIY spot-checks. Unplug your mouse. Try to navigate your homepage, your contact form, and your most important conversion page using only Tab, Shift-Tab, Enter, and Escape. If you cannot, neither can a real keyboard user. This takes twenty minutes and catches the highest-stakes failures automation misses.
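If you want a quick preview before the mouse comes unplugged, a rough browser-console sketch can list what the Tab key should reach. Treat it as a starting point, not a verdict: real tab order also depends on DOM order, visibility, disabled state, and tabindex values.

```javascript
// Rough CSS selector for natively focusable elements. Illustrative only:
// real tab order also depends on visibility, disabled state, and tabindex.
const FOCUSABLE =
  'a[href], button, input, select, textarea, [tabindex]:not([tabindex="-1"])';

// Paste into the browser console on your own page:
// document.querySelectorAll(FOCUSABLE).forEach((el, i) => console.log(i, el));
```

If an interactive widget on your page does not appear in that list, the keyboard probably cannot reach it, and the twenty-minute test above will confirm it.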

Option two: a real audit report. Automated scanners are the foundation, but the interpretation layer matters. A useful report tells you which failures are template-level versus one-off, which originate from vendor code versus your own, which require manual verification before a developer touches them, and which come with before/after code pairs your developer can copy. That is the difference between "here are 94 errors" and "here is what to do Monday morning." Our $49 audit report does exactly this. View a sample before you spend a dollar.

Option three: manual testing with assistive technology. An accessibility specialist sits with your site for several hours using VoiceOver, NVDA, and keyboard only. They find the equivalence failures no scanner can. This costs $2,000 to $8,000 and is the gold standard. It is also overkill for most small businesses until after the automated findings are resolved and documented.

Most small businesses do option one immediately and option two before they consider option three. The documented $49 report is stronger evidence of good-faith compliance effort than any free scan printout, and it is the artifact most small-business attorneys point to when a demand letter arrives.

The bottom line

Free tools are useful. They are also incomplete. Treating a green result from axe-core as compliance is the single most expensive misunderstanding a small business owner can make about the ADA. Use the free tool to learn what you have. Use a real audit to learn what to do. Keep a keyboard-only spot-check in your launch routine forever.

Find out what your site actually has

Run a free scan in 60 seconds. No signup, no credit card. Then decide whether you want the $49 report with fixes for your developer.

Run Free Scan →
View Sample Report →