You have linting for your code. Type checking for your TypeScript. E2E tests for your UI. But what about your SEO?
SEO regressions are silent killers. Someone removes a meta description. A deploy breaks the canonical tag. The sitemap gets deleted. Nobody notices until traffic drops weeks later.
The fix: add SEO checks to your CI/CD pipeline. If the SEO score drops below a threshold, the build fails. Just like a broken test.
```
# Pseudocode
1. Deploy to staging/preview
2. Run SEO audit against the preview URL
3. If score < 85 → fail the build
4. If score >= 85 → proceed to production
```
```yaml
name: SEO Check

on:
  pull_request:
    branches: [main]

jobs:
  seo-audit:
    runs-on: ubuntu-latest
    steps:
      - name: Wait for preview deploy
        # Your preview deployment step here
        # e.g., Vercel, Netlify, etc.
        # Give it `id: deploy` so its output URL is readable below.
      - name: Run SEO Audit
        env:
          SEO_API_KEY: ${{ secrets.SEO_SCORE_API_KEY }}
          PREVIEW_URL: ${{ steps.deploy.outputs.url }}
        run: |
          RESULT=$(curl -s -H "X-API-Key: $SEO_API_KEY" \
            "https://seoscoreapi.com/audit?url=$PREVIEW_URL")
          SCORE=$(echo "$RESULT" | jq '.score')
          GRADE=$(echo "$RESULT" | jq -r '.grade')
          ISSUES=$(echo "$RESULT" | jq '.priorities | length')

          echo "SEO Score: $SCORE/100 ($GRADE)"
          echo "Open issues: $ISSUES"

          # Print priorities
          echo "$RESULT" | jq -r '.priorities[] | "  [\(.severity)] \(.issue)"'

          # Fail if score is below threshold
          if [ "$SCORE" -lt 85 ]; then
            echo "::error::SEO score $SCORE is below threshold (85)"
            exit 1
          fi
          echo "✅ SEO check passed: $SCORE/100"
```
The same gate in GitLab CI:

```yaml
seo-audit:
  stage: test
  image: alpine/curl
  script:
    - apk add --no-cache jq
    - |
      RESULT=$(curl -s -H "X-API-Key: $SEO_SCORE_API_KEY" \
        "https://seoscoreapi.com/audit?url=$PREVIEW_URL")
      SCORE=$(echo "$RESULT" | jq '.score')
      echo "SEO Score: $SCORE/100"
      echo "$RESULT" | jq '.priorities[]'
      [ "$SCORE" -ge 85 ] || exit 1
  only:
    - merge_requests
```
For more control, use a Python script that you can customize:
```python
#!/usr/bin/env python3
"""SEO quality gate for CI/CD pipelines."""
import json
import os
import sys
import urllib.parse
import urllib.request

API_KEY = os.environ["SEO_SCORE_API_KEY"]
URL = os.environ.get("PREVIEW_URL", os.environ.get("DEPLOY_URL", ""))
THRESHOLD = int(os.environ.get("SEO_THRESHOLD", "85"))

if not URL:
    print("ERROR: Set PREVIEW_URL or DEPLOY_URL")
    sys.exit(1)

req = urllib.request.Request(
    # URL-encode the target so query characters survive the round trip
    f"https://seoscoreapi.com/audit?url={urllib.parse.quote(URL, safe='')}",
    headers={"X-API-Key": API_KEY},
)
with urllib.request.urlopen(req) as resp:
    data = json.load(resp)

score = data["score"]
grade = data["grade"]
priorities = data.get("priorities", [])

print(f"\n{'=' * 50}")
print(f"SEO AUDIT: {URL}")
print(f"Score: {score}/100 ({grade})")
print(f"{'=' * 50}")

if priorities:
    print(f"\n⚠️  {len(priorities)} issue(s) found:")
    for p in priorities:
        print(f"  [{p['severity'].upper()}] {p['issue']}")
        print(f"    → {p['fix']}")
else:
    print("\n✅ No issues found!")

print(f"\nThreshold: {THRESHOLD}")
if score < THRESHOLD:
    print(f"\n❌ FAILED: Score {score} is below threshold {THRESHOLD}")
    sys.exit(1)
print(f"\n✅ PASSED: Score {score} meets threshold {THRESHOLD}")
```
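One caveat: `urllib.request.urlopen` raises on network failures, so a flaky connection would fail the gate with a traceback. A hedged sketch of a retry wrapper you could drop into the script (the exit code 2 convention is an assumption, chosen so infrastructure failures are distinguishable from genuine SEO failures):

```python
import sys
import urllib.error
import urllib.request


def fetch_with_retry(url, headers, attempts=3, timeout=30):
    """Retry transient failures so a blip doesn't fail the gate."""
    for i in range(attempts):
        try:
            req = urllib.request.Request(url, headers=headers)
            return urllib.request.urlopen(req, timeout=timeout)
        except urllib.error.URLError as exc:
            if i == attempts - 1:
                print(f"ERROR: audit request failed: {exc}")
                # Exit 2 = infrastructure problem, not an SEO failure
                sys.exit(2)
```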
You can also log scores to track trends:
```shell
# Add to your CI script
echo "$(date -u +%Y-%m-%dT%H:%M:%SZ),$SCORE,$GRADE,$ISSUES" >> seo-scores.csv
git add seo-scores.csv
git commit -m "SEO score: $SCORE ($GRADE)"
```
This gives you a CSV history of your SEO score with every deploy. Plot it in a dashboard to catch gradual degradation.
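A minimal sketch of reading that CSV back and flagging a downward trend (the column order matches the `echo` line above; the rule "latest score 3+ points below the recent average" is an arbitrary choice, not part of any API):

```python
import csv
from io import StringIO


def is_degrading(scores, window=5, drop=3):
    """True when the latest score sits `drop`+ points below the
    average of the previous `window` scores."""
    if len(scores) <= window:
        return False
    baseline = sum(scores[-window - 1:-1]) / window
    return scores[-1] <= baseline - drop


# Hypothetical seo-scores.csv content (timestamp,score,grade,issues)
sample = """\
2024-05-01T10:00:00Z,90,A,1
2024-05-02T10:00:00Z,89,A,1
2024-05-03T10:00:00Z,90,A,1
2024-05-04T10:00:00Z,88,B,2
2024-05-05T10:00:00Z,89,A,1
2024-05-06T10:00:00Z,84,B,4
"""
scores = [int(row[1]) for row in csv.reader(StringIO(sample))]
print(is_degrading(scores))  # → True: 84 is well below the recent average
```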
Pick a threshold your site can actually pass today. Start with 75 and raise it over time as your team fixes existing issues.
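One way to ratchet the threshold without manual edits is to only raise it when recent scores comfortably clear it, and never lower it automatically. A sketch, with a hypothetical `next_threshold` helper and margin values you would tune:

```python
def next_threshold(current, recent_scores, margin=3, cap=95):
    """Raise the gate by one point when every recent score clears
    it by `margin`; hold otherwise. Never exceeds `cap`."""
    if recent_scores and min(recent_scores) >= current + margin:
        return min(current + 1, cap)
    return current


print(next_threshold(75, [80, 81, 79]))  # → 76: all scores clear 75 + 3
print(next_threshold(75, [80, 76, 79]))  # → 75: one score is too close
```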
Pro tip: Run SEO checks only on PRs that modify HTML, templates, or content files. Use path filters to avoid burning audits on backend-only changes.
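In GitHub Actions, that path filter might look like the following (the globs are illustrative; match them to your repo layout):

```yaml
on:
  pull_request:
    branches: [main]
    paths:
      - "**/*.html"
      - "templates/**"
      - "content/**"
      - "public/**"
```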