A 50-location healthcare practice has hundreds of patient-facing pages: location landing pages, provider profiles, condition pages, appointment booking flows, blog content, insurance pages. Auditing every page manually is impractical — that's a multi-month, six-figure engagement.

The realistic workflow is two weeks of automated scanning plus targeted manual review of the patterns the scan surfaces. Here's how to actually do it.

Day 1: Inventory

You can't audit what you can't list. Start by enumerating:

  1. Location URLs. Every location's main page.
  2. Provider URLs. Every provider profile.
  3. Booking flows. Every URL that initiates appointment booking.
  4. Critical landing pages. Insurance accepted, services, conditions treated.
  5. Forms. Contact, intake, registration, feedback.

For most practices this list is 200–800 URLs. Get it from your CMS export, your sitemap.xml, or a basic crawl.
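If you're starting from sitemap.xml, pulling the list takes a few lines. Here's a minimal sketch using only the standard library, assuming a single sitemap file rather than a sitemap index (the URL below is a placeholder):

import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example-practice.com/sitemap.xml"  # replace with your own
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)

# Collect every <loc> entry; a sitemap index would need one more level of fetching.
urls = [loc.text.strip() for loc in tree.findall(".//sm:loc", NS)]

with open("urls.txt", "w") as f:
    f.write("\n".join(urls))

print(f"{len(urls)} URLs written to urls.txt")

Cross-check the count against your CMS export; pages missing from the sitemap are usually the ones nobody is maintaining.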

Day 2: Set up batch scanning

SEO Score API provides a /audit/batch endpoint that audits up to 10 URLs per call. For 500 URLs, that's 50 batch calls — runnable in about 30 minutes total.

Sample workflow in Python:

from seoscoreapi import batch_audit
import os, json

# One URL per line, built during the Day 1 inventory.
with open("urls.txt") as f:
    URLS = [line.strip() for line in f if line.strip()]

API_KEY = os.environ["SEOSCORE_API_KEY"]
all_results = []

# The batch endpoint takes up to 10 URLs per call, so walk the list in chunks.
for chunk in (URLS[i:i+10] for i in range(0, len(URLS), 10)):
    result = batch_audit(chunk, api_key=API_KEY)
    all_results.extend(result["results"])

# Persist the raw results; the pattern analysis on Days 3-5 reads this file.
with open("audit_results.json", "w") as f:
    json.dump(all_results, f, indent=2)

Cost: a Pro plan ($39/month, 5,000 audits) covers a 500-URL initial audit plus monthly re-scans with room to spare.

Days 3–5: Pattern analysis

You now have hundreds of URLs each with scores and issue lists. Don't try to read all of them. Aggregate.

Group issues by failure pattern. For each pattern, count how many URLs are affected.

from collections import Counter

pattern_counts = Counter()
for url_result in all_results:
    # Each result groups checks by category; count every failed or warned check.
    for category in url_result.get("audit", {}).values():
        for check in category.get("checks", []):
            if check["status"] in ("fail", "warning"):
                pattern_counts[check["name"]] += 1

# Top 20 patterns, with the share of audited URLs each one touches.
for pattern, count in pattern_counts.most_common(20):
    pct = 100 * count / len(all_results)
    print(f"{pattern:40s} {count:>4d} URLs ({pct:.0f}%)")

This separates template-level issues (affecting 80%+ of URLs) from per-page issues (affecting one or a few). Template-level issues are higher leverage — one fix, hundreds of URLs improved.

Days 5–7: Triage by impact

For each significant pattern, ask:

  • Severity: is this Level A, AA, or AAA? (A and AA are the priority; AAA is aspirational.)
  • Surface area: how many URLs affected?
  • Fix complexity: CSS change, template change, content change, vendor change?

Build a backlog ranked by severity × surface_area / complexity. The top of the list is what you fix first.
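The scoring itself is a one-liner once you assign weights. Here's a sketch with illustrative weights (the numbers and the findings data are assumptions to tune, not from any standard):

# Illustrative weights; tune to your own risk tolerance.
SEVERITY_WEIGHT = {"A": 3, "AA": 2, "AAA": 1}
COMPLEXITY_WEIGHT = {"css": 1, "template": 2, "content": 3, "vendor": 5}

def priority(item):
    """severity x surface area / fix complexity"""
    return (SEVERITY_WEIGHT[item["level"]] * item["urls_affected"]
            / COMPLEXITY_WEIGHT[item["fix_type"]])

# Hypothetical findings for illustration.
findings = [
    {"name": "table contrast", "level": "AA", "urls_affected": 500, "fix_type": "css"},
    {"name": "date picker", "level": "A", "urls_affected": 50, "fix_type": "vendor"},
]

for item in sorted(findings, key=priority, reverse=True):
    print(f'{item["name"]:20s} priority {priority(item):.0f}')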

Typical patterns at the top:

  • Color contrast on rate/info tables (CSS, all pages, high impact).
  • Focus indicators removed globally (CSS, all pages, high impact).
  • Form labels missing on a shared component (template, dozens of pages, medium impact).
  • Inaccessible date picker on booking (vendor pressure, all booking flows, medium impact).
  • Provider photo missing alt text (per-page CMS, hundreds of pages, low-medium impact).

Days 8–10: Template-level fixes

Most practices can ship template-level CSS and HTML fixes within a week. Focus areas:

  • Color contrast: update CSS variables.
  • Focus indicators: replace outline: none rules with visible focus styles.
  • Skip links: add a "skip to main content" link as the first focusable element.
  • Form labels: ensure every input has an associated <label> in the template.
  • ARIA: add aria-label to icon-only buttons.

Run a re-scan after the deploy. Most template-level fixes will show immediate improvement across all affected URLs.
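One way to verify is to diff the two result files. A minimal sketch, assuming each result carries top-level "url" and "score" fields (adjust to the actual response shape):

import json

def scores(path):
    # Assumes each result exposes "url" and a numeric "score"; adjust as needed.
    with open(path) as f:
        return {r["url"]: r["score"] for r in json.load(f)}

before = scores("audit_results.json")
after = scores("audit_results_rescan.json")

improved = sum(1 for u in before if u in after and after[u] > before[u])
regressed = {u: (before[u], after[u]) for u in before
             if u in after and after[u] < before[u]}

print(f"{improved} URLs improved, {len(regressed)} regressed")
for url, (b, a) in sorted(regressed.items()):
    print(f"  {url}: {b} -> {a}")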

Days 10–14: Manual sample audit

Automated scanning catches roughly 30–40% of WCAG issues; the remaining 60–70% require human review. For a 500-URL site, manually reviewing every URL is impractical; manually reviewing a representative sample is.

Sample 10–20 URLs across:

  • Location pages (3–4 random)
  • Provider pages (3–4 random)
  • Booking flow (every step, end-to-end)
  • Critical content pages (conditions, services)
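To draw that sample reproducibly, a short sketch; the path prefixes are assumptions about your URL structure, so adjust them to match. The booking flow isn't sampled because you walk every step of it:

import random

random.seed(42)  # fixed seed so the sample is reproducible quarter to quarter

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

# Assumed path conventions; change the prefixes to match your site.
buckets = {
    "locations": [u for u in urls if "/locations/" in u],
    "providers": [u for u in urls if "/providers/" in u],
    "content": [u for u in urls if "/conditions/" in u or "/services/" in u],
}

for name, bucket in buckets.items():
    for url in random.sample(bucket, min(4, len(bucket))):
        print(f"{name}: {url}")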

For each, run keyboard-only and screen-reader tests. Document issues that automated scanning didn't surface. These typically include alt-text quality issues, ambiguous link text, complex form interactions, and dynamic-content announcements.

Day 14: Document the program

Examiners and (if needed) defense counsel want artifacts. Produce:

  1. Audit summary: total URLs scanned, pattern-level findings, severity breakdown.
  2. Remediation log: what was fixed, when, by whom, against which findings.
  3. Manual audit notes: sample URLs, issues found, remediation status.
  4. Ongoing monitoring plan: continuous scanning cadence, who reviews, what triggers escalation.

This document set is what a downstream OCR investigation or ADA defense engagement will ask for. Producing it once and updating quarterly is much easier than producing it under pressure.
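The audit summary, at least, can be generated straight from the scan data. A minimal sketch, reusing all_results and pattern_counts from the earlier scripts:

from datetime import date

with open("audit_summary.md", "w") as f:
    f.write(f"# Accessibility audit summary, {date.today()}\n\n")
    f.write(f"URLs scanned: {len(all_results)}\n\n")
    f.write("## Findings by pattern\n\n")
    for pattern, count in pattern_counts.most_common():
        f.write(f"- {pattern}: {count} URLs\n")

Regenerate it after every monthly sweep so the artifact trail stays current without extra work.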

Ongoing: continuous monitoring

The two-week audit is a one-time push. The ongoing program is:

  • Weekly: continuous scan of the homepage and a sample of high-traffic pages.
  • Monthly: full-site sweep across all locations.
  • Quarterly: manual audit of a rotating sample of pages.
  • Annually or biannually: external manual audit by an accessibility consultant.

SEO Score API's healthcare vertical supports the weekly and monthly automated layers. You can wire the batch endpoint into a cron job or n8n workflow that emails when scores drop.
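A minimal sketch of that score-drop alert, assuming the hypothetical batch_audit call from Day 2, a baseline_scores.json file you maintain from the last accepted scan, a local SMTP relay, and placeholder email addresses. A cron entry like 0 6 * * 1 would run it on Monday mornings:

import json, os, smtplib
from email.message import EmailMessage
from seoscoreapi import batch_audit  # same hypothetical SDK call as Day 2

THRESHOLD = 5  # alert when a score drops by more than this many points

# {url: score} for the weekly sample (10 URLs or fewer, one batch call).
with open("baseline_scores.json") as f:
    baseline = json.load(f)

result = batch_audit(list(baseline), api_key=os.environ["SEOSCORE_API_KEY"])
drops = [(r["url"], baseline[r["url"]], r["score"])
         for r in result["results"]
         if baseline[r["url"]] - r["score"] > THRESHOLD]

if drops:
    msg = EmailMessage()
    msg["Subject"] = f"Accessibility score drop on {len(drops)} URL(s)"
    msg["From"] = "alerts@example-practice.com"  # placeholder addresses
    msg["To"] = "webteam@example-practice.com"
    msg.set_content("\n".join(f"{u}: {b} -> {s}" for u, b, s in drops))
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)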

What if our 50 locations are on different platforms?

This is the harder case. Multi-CMS, multi-location practices typically show a wider spread of issue patterns and need per-platform fixes. The audit workflow is the same; the remediation just runs in parallel across platforms.

Consider whether consolidation makes sense. Multi-CMS architectures introduce ongoing accessibility, SEO, and operational costs. Most practices in this state benefit from a 12–18 month consolidation roadmap.

What if we use a third-party patient portal vendor?

Patient portal accessibility is the vendor's responsibility, but you are still legally accountable for the patient experience. Request the vendor's VPAT, document your evaluation, and press for remediation if the VPAT shows gaps.

For your scope: scan the public marketing site (which is what plaintiff firms scan). The portal is a separate workstream.

The realistic budget for a 50-location practice

  • Initial 2-week audit: 60–80 hours of internal staff time (analyst + 1–2 developers), or $15K–$30K outsourced.
  • Initial template-level remediation: 80–160 hours of dev time. Highly variable; depends on platform.
  • Continuous monitoring: $39/month (Pro plan).
  • Annual manual audit: $15K–$50K depending on scope.
  • Ongoing remediation labor: 5–15 hours/month after initial cleanup.

Total first-year cost for a defensible accessibility posture: typically $30K–$80K. That's less than a single ADA settlement.

What's the alternative?

The alternative is the status quo: no continuous monitoring, no documented remediation, no manual audit cadence. The expected cost of that posture, integrated over a 5-year horizon for a multi-location healthcare practice, is materially higher than the program cost — and usually arrives all at once in the form of a demand letter or OCR investigation.


Free scan of one of your locations → start the audit log today.