Solid · MIT · Original

Dead Link Crawler

Crawls a live site for 4xx/5xx responses — reports broken internal links and rotted outbound links.

Entry verified April 20, 2026

The short answer

The classic site-hygiene check. Outbound links rot at ~10%/year. Internal links break when you rename routes. Run quarterly.

When to use it

Quarterly. After a URL structure change. Before a full-site audit.

Setup

  1. Save as ~/.claude/commands/dead-links.md.

  2. Uses a headless fetch + simple crawl.

  3. Invoke: /dead-links <site>.
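The entry doesn't reproduce the command file itself. One plausible shape for it, assuming the standard Claude Code custom-command format (a markdown prompt with `$ARGUMENTS` standing in for the site you pass) — the wording below is a sketch, not the author's original:

```markdown
---
description: Crawl a site for 4xx/5xx responses
---
Crawl $ARGUMENTS starting from the homepage. Follow internal links
breadth-first, record the HTTP status of every internal and outbound
link, and report all 4xx/5xx results grouped into broken internal
links and rotted outbound links, noting the page each was found on.
```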

Example

You: /dead-links promptwritingstudio.com
Claude: Crawled 184 pages, 1,247 links. 3 broken. All 3 are outbound (rotted referenced articles). Fix list attached.

Source & attribution

Author: Bryan Collins
Licence: MIT
Type: Original

Original pattern published under MIT — attribution preserved by convention, not licence requirement.

Caveats

Respect robots.txt on outbound crawls. Heavy on big sites — rate-limit.
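To make the "headless fetch + simple crawl" concrete, here is a minimal sketch of the crawl loop with a crude rate limit. The fetch function is injected so the transport is pluggable; robots.txt handling and redirect following are deliberately left out, and all names here are illustrative, not the author's code:

```python
import time
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse


class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, fetch, delay=0.0, max_pages=500):
    """Breadth-first crawl; returns {url: status_code}.

    `fetch(url)` must return (status_code, html_text_or_None).
    Only internal pages are parsed for further links; outbound
    links are fetched once for their status and never expanded.
    """
    host = urlparse(start_url).netloc
    seen, results = set(), {}
    queue = deque([start_url])
    while queue and len(results) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        status, body = fetch(url)
        results[url] = status
        if body and urlparse(url).netloc == host:
            parser = LinkExtractor()
            parser.feed(body)
            for href in parser.links:
                queue.append(urljoin(url, href))
        time.sleep(delay)  # crude rate limit between requests
    return results
```

Anything with a status in the 400–599 range is then a broken link; partitioning those by whether the netloc matches the start URL's host splits them into internal vs outbound, as in the example report above.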
