Link Depth Analyzer
Find pages buried too deep in your site architecture.
Seed URL
A BFS crawl of up to 30 pages, to a maximum depth of 4. Pages at depth ≥ 4 are typically considered buried for SEO.
Start here · Why link depth matters for SEO
Depth counts clicks from the seed URL along internal links. Important commercial pages buried at depth four or deeper often receive weaker crawl priority and PageRank flow.
This client-side tool runs a bounded BFS crawl through /api/fetch: it stays on the seed host, normalizes URLs by stripping query strings and trailing slashes, and stops after 30 pages or depth 4.
Results bucket URLs by depth and show the via parent so you can trace the shortest internal path that buries a URL.
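The bounded crawl described above can be sketched as a breadth-first search with URL normalization, a same-host filter, and page/depth caps. This is a minimal illustration, not the tool's actual implementation; `normalize_url` and the `fetch_links` callback are hypothetical names.

```python
from collections import deque
from urllib.parse import urlparse, urlunparse

MAX_PAGES = 30
MAX_DEPTH = 4

def normalize_url(url: str) -> str:
    """Strip query string, fragment, and trailing slash so URL variants dedupe."""
    p = urlparse(url)
    path = p.path.rstrip("/") or "/"
    return urlunparse((p.scheme, p.netloc, path, "", "", ""))

def bounded_bfs(seed: str, fetch_links) -> dict:
    """BFS from seed, staying on the seed host.

    Returns {url: (depth, via_parent)}; the parent pointer lets you trace
    the shortest internal path back to the seed.
    """
    seed = normalize_url(seed)
    host = urlparse(seed).netloc
    seen = {seed: (0, None)}
    queue = deque([seed])
    while queue and len(seen) < MAX_PAGES:
        url = queue.popleft()
        depth, _ = seen[url]
        if depth >= MAX_DEPTH:
            continue  # do not expand links past the depth cap
        for link in fetch_links(url):
            link = normalize_url(link)
            if urlparse(link).netloc != host or link in seen:
                continue  # skip off-host and already-discovered URLs
            seen[link] = (depth + 1, url)  # first discovery = shortest depth
            queue.append(link)
            if len(seen) >= MAX_PAGES:
                break
    return seen
```

In the real tool, `fetch_links` would wrap the /api/fetch endpoint and extract anchor hrefs from the returned HTML; here it is a stand-in for any function mapping a URL to its outgoing links.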
When to use this tool
- Template audits
See whether category or product detail page (PDP) templates strand new URLs at depth ≥ 4 from the homepage seed.
- After mega-menu changes
Compare before/after crawls from the same seed to confirm navigation fixes shortened paths.
- Subfolder or subdomain launches
Seed a blog or help-center section to verify cross-linking within that silo stays shallow.
- Hands-on education
Point students at public sites with predictable nav to visualize BFS depth in real HTML.
Examples
Walk through these with the form above — they are practice scenarios, not live data.
Depth 4 border styling
Try this
Expand the crawl until at least one URL lands in the depth-4 bucket.
What to look for
Those cards pick up rose borders; treat them as a checklist of pages that need extra internal links or hub navigation.
Progress counter while crawling
Try this
Click Crawl on a large site and watch done · queued update.
What to look for
The counter helps you judge whether the 30-page cap stopped the crawl before the queue emptied.
Short tutorial
Follow in order the first time you use the tool; later you can skip to the step you need.
- Step 1 — Choose a realistic seed
Use the URL you treat as the crawl root—often the homepage or major hub.
- Step 2 — Run the crawl
Wait for completion; if the queue empties before the 30-page cap is reached, the site's navigation is too thin to surface more URLs from that seed.
- Step 3 — Read depth stats
Tiles summarize how many URLs exist at depths zero through four.
- Step 4 — Inspect via paths
Expand the Pages by depth list to see which parent links caused deep placements.
- Step 5 — Feed recommendations elsewhere
Send shallow hubs to Internal Link Recommender and orphans to Orphan Page Finder.
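Steps 3 and 4 amount to grouping crawl results by depth and walking via-parent pointers back to the seed. A sketch, assuming a `{url: (depth, via_parent)}` map like the one a BFS crawl might produce:

```python
def bucket_by_depth(results: dict) -> dict:
    """Group URLs by BFS depth, e.g. {0: [seed], 1: [...], ...}."""
    buckets = {}
    for url, (depth, _parent) in results.items():
        buckets.setdefault(depth, []).append(url)
    return buckets

def via_path(results: dict, url: str) -> list:
    """Follow via-parent pointers from url back to the seed.

    Returns the shortest internal path, seed first.
    """
    path = [url]
    while results[path[-1]][1] is not None:
        path.append(results[path[-1]][1])
    return list(reversed(path))
```

`bucket_by_depth` mirrors the depth-stat tiles; `via_path` mirrors the "via parent" trace in the Pages by depth list.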
More detail
New here? Skim Start here first, then run one Examples scenario in the form above.
Link Depth Analyzer does one job: find pages buried too deep in your site architecture. It lives under Internal Linking on SEOToolkits, and the beginner idea behind it is simple: internal linking connects pages on your own site so users and crawlers can find important content.
FAQ
- Why fewer than 30 pages returned?
- The site may have thin internal linking, some fetches may have failed, or the queue may have hit the depth limit before discovering more URLs.
- JavaScript-only links?
- The fetch endpoint returns server-rendered HTML; links that exist only after client-side rendering are invisible to the crawl. Compare with JavaScript Rendering Checker.
- Cross-subdomain paths?
- The crawl is restricted to the seed host. Run again with a different seed to cover other hosts.
- Does depth equal clicks for users?
- It approximates shortest internal graph distance from the seed, not logged analytics paths.
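The last FAQ point can be checked on a toy graph: a page linked both from the homepage and from the end of a deep chain records the shallower depth, because BFS fixes each URL's depth at its first (and therefore shortest) discovery. A minimal sketch with made-up page names:

```python
from collections import deque

def bfs_depths(graph: dict, seed: str) -> dict:
    """Depth of each node = shortest link distance from the seed."""
    depths = {seed: 0}
    queue = deque([seed])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in depths:  # first visit wins: shortest distance
                depths[nxt] = depths[node] + 1
                queue.append(nxt)
    return depths
```

Here "promo" is reachable in one click from "home" and also at the end of a three-click chain; BFS reports depth 1, matching the shortest-internal-path semantics described above.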
Related tools
Same workflow cluster on SEOToolkits — open another module without leaving context.
Internal Link Analyzer
Map your internal link graph with depth and equity flow.
Orphan Page Finder
Discover indexed pages that no internal link points to.
Internal Link Recommender
Suggest contextual internal links from existing content.
Sitemap Analyzer
Audit XML sitemaps for coverage, errors, and stale entries.