VI

Web Development

Screaming Frog SEO Spider v20.2

Description

The Screaming Frog SEO Spider is a website crawler that helps you improve onsite SEO by extracting data & auditing for common SEO issues. Download & crawl up to 500 URLs for free, or buy a licence to remove the limit & access advanced features.

What can you do with the SEO Spider Tool?

The SEO Spider is a powerful and flexible site crawler, able to crawl both small and very large websites efficiently, while allowing you to analyse the results in real time. It gathers key onsite data to allow SEOs to make informed decisions.

Find Broken Links

Crawl a website instantly and find broken links (404s) and server errors. Bulk export the errors and source URLs to fix, or send to a developer.
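At its core, a broken-link audit requests every discovered URL and buckets the response status. A minimal sketch of that bucketing in Python (function names are illustrative, not Screaming Frog's API):

```python
def classify_status(status: int) -> str:
    """Bucket an HTTP status code the way a link audit would."""
    if 200 <= status < 300:
        return "ok"
    if 300 <= status < 400:
        return "redirect"
    if 400 <= status < 500:
        return "client error"   # 404s land here
    return "server error"       # 5xx


def broken_links(results: dict[str, int]) -> list[str]:
    """Given {url: status}, return the URLs worth exporting to a developer."""
    return [url for url, status in results.items()
            if classify_status(status) in ("client error", "server error")]
```

A crawler would feed `broken_links` the status of every internal and external URL it fetched, alongside the source pages that link to them.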

Analyse Page Titles & Meta Data

Analyse page titles and meta descriptions during a crawl and identify those that are too long, short, missing, or duplicated across your site.
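The length and duplication checks amount to simple rules over a url→title map. A hedged sketch (the 30/60-character thresholds are common rules of thumb, not the tool's exact limits, which are pixel-based):

```python
from collections import Counter

TITLE_MIN, TITLE_MAX = 30, 60  # assumed character thresholds


def audit_titles(pages: dict[str, str]) -> dict[str, list[str]]:
    """Map each issue type to the URLs exhibiting it, given {url: title}."""
    issues = {"missing": [], "too long": [], "too short": [], "duplicate": []}
    counts = Counter(title for title in pages.values() if title)
    for url, title in pages.items():
        if not title:
            issues["missing"].append(url)
            continue
        if len(title) > TITLE_MAX:
            issues["too long"].append(url)
        elif len(title) < TITLE_MIN:
            issues["too short"].append(url)
        if counts[title] > 1:
            issues["duplicate"].append(url)
    return issues
```

The same shape of check applies to meta descriptions with different thresholds.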

Extract Data with XPath

Collect any data from the HTML of a web page using CSS Path, XPath or regex. This might include social meta tags, additional headings, prices, SKUs and more!
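Under the hood this is ordinary XPath evaluation against the parsed page. A toy example with the standard library (real-world HTML needs a tolerant parser such as lxml; this assumes well-formed markup):

```python
import xml.etree.ElementTree as ET

html = """<html><head>
  <meta property="og:title" content="Example Product"/>
  <meta property="og:price" content="19.99"/>
</head><body><h2>Specs</h2></body></html>"""

root = ET.fromstring(html)

# Attribute-predicate XPath, as you might configure for social meta tags.
og_title = root.find(".//meta[@property='og:title']").get("content")

# Pull out additional headings with a simple descendant query.
extra_headings = [h.text for h in root.findall(".//h2")]
```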

Generate XML Sitemaps

Quickly create XML Sitemaps and Image XML Sitemaps, with advanced configuration over URLs to include, last modified, priority and change frequency.
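The sitemap format itself is a small XML vocabulary, so the generation step is straightforward. A minimal generator (illustrative only, not the tool's implementation):

```python
import xml.etree.ElementTree as ET


def build_sitemap(urls) -> str:
    """urls: iterable of (loc, lastmod, changefreq, priority) tuples."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod, changefreq, priority in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
        ET.SubElement(url, "changefreq").text = changefreq
        ET.SubElement(url, "priority").text = priority
    return ET.tostring(urlset, encoding="unicode")
```

The advanced configuration described above corresponds to choosing which URLs feed this loop and what values each optional field takes.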

Crawl JavaScript Websites

Render web pages using the integrated Chromium WRS to crawl dynamic, JavaScript-rich websites and frameworks, such as Angular, React and Vue.js.

Audit Redirects

Find temporary and permanent redirects, identify redirect chains and loops, or upload a list of URLs to audit in a site migration.
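Chain and loop detection is a walk over a url→target map. A sketch (`trace_redirects` is a hypothetical helper, not part of the tool):

```python
def trace_redirects(redirect_map: dict[str, str], start: str,
                    max_hops: int = 10):
    """Follow url -> target mappings, reporting the chain and a verdict.

    redirect_map holds only URLs that redirect; anything absent is final.
    """
    chain, seen = [start], {start}
    while chain[-1] in redirect_map:
        nxt = redirect_map[chain[-1]]
        if nxt in seen:
            return chain + [nxt], "loop"
        chain.append(nxt)
        seen.add(nxt)
        if len(chain) > max_hops:
            return chain, "too many hops"
    return chain, "chain" if len(chain) > 2 else "ok"
```

In list mode, each uploaded URL would be the `start` of one trace, and anything flagged "chain" or "loop" is a migration issue to fix.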

Discover Duplicate Content

Discover exact and near duplicate content, duplicated elements such as page titles, descriptions or headings, and find low content pages.
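One classic way to score near-duplicates, shown here as an illustration rather than Screaming Frog's actual algorithm, is Jaccard similarity over word shingles:

```python
def shingles(text: str, k: int = 3) -> set:
    """k-word shingles of lower-cased text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}


def similarity(a: str, b: str) -> float:
    """Jaccard similarity of two pages' shingle sets (1.0 = exact duplicate)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

Pages scoring above some threshold (say 0.9) would be flagged as near duplicates; low shingle counts also hint at thin content.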

Review Robots & Directives

View URLs blocked by robots.txt, meta robots or X-Robots-Tag directives such as 'noindex' or 'nofollow', as well as canonicals and rel="next" and rel="prev".
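For robots.txt specifically, Python's standard library can replicate the blocked/allowed check (a simplified illustration; the directives below are made up):

```python
from urllib.robotparser import RobotFileParser

# Parse a robots.txt fetched elsewhere (the fetch itself is omitted).
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

blocked = not rp.can_fetch("*", "https://example.com/private/report")
allowed = rp.can_fetch("*", "https://example.com/public/page")
```

Meta robots and X-Robots-Tag checks work differently: they are read from the page's HTML and response headers after the URL has been fetched.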

Integrate with GA, GSC & PSI

Connect to the Google Analytics, Search Console and PageSpeed Insights APIs and fetch user and performance data for all URLs in a crawl for greater insight.
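As an illustration of the PageSpeed Insights side, the v5 API is a single HTTP endpoint, so fetching performance data per URL is a matter of building one request each (the actual fetch and API key are omitted):

```python
from urllib.parse import urlencode


def psi_request_url(page_url: str, api_key: str,
                    strategy: str = "mobile") -> str:
    """Build a PageSpeed Insights v5 request URL for one crawled URL."""
    endpoint = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
    params = urlencode({"url": page_url, "key": api_key, "strategy": strategy})
    return f"{endpoint}?{params}"
```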

Visualise Site Architecture

Evaluate internal linking and URL structure using interactive crawl and directory force-directed diagrams and tree graph site visualisations.
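The crawl-depth numbers behind such visualisations come from a breadth-first walk of the internal link graph. A minimal sketch:

```python
from collections import deque


def crawl_depths(links: dict[str, list[str]], root: str) -> dict[str, int]:
    """BFS over internal links: depth = fewest clicks from the root.

    links maps each URL to the URLs it links to.
    """
    depths, queue = {root: 0}, deque([root])
    while queue:
        url = queue.popleft()
        for target in links.get(url, []):
            if target not in depths:
                depths[target] = depths[url] + 1
                queue.append(target)
    return depths
```

A force-directed or tree-graph view is then a rendering of this graph, with depth and link counts driving node placement and size.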
