The first time I used Screaming Frog’s SEO Spider, I admit I felt overwhelmed. I kept getting lost in all the information, tabs, tables, and numbers. Multiple times I asked myself: “Where am I supposed to be looking?” I felt like I was missing out on valuable information and insights because I didn’t understand how to use the application.
For the last few months, I’ve been making it a focus to get better with the Screaming Frog SEO Spider tool, and it has made a huge difference in my website audits and work. For those who are new and are still trying to get the hang of Screaming Frog, I’ve put together a guide on how I use this powerful SEO tool to analyze and decipher a website. My goal is to help reduce confusion and chaos and increase productivity and confidence for first-time and beginner Screaming Frog users.
So where to begin?
Before you touch anything, I suggest you familiarize yourself with Screaming Frog’s settings and menu items. To help with this, I’m going to do a quick rundown of the main navigation menus and settings that I use:
Under the File menu, you can save a crawl as a file so you can always have the crawl data on hand. If you forgot to save a crawl, you can access the last six crawls that you performed in Screaming Frog from this menu. Additionally, if you want to set any configuration settings as a default in the application, you can set that up here.
Personally, the Configuration menu is probably the most important menu to know, because this is where you set up all of your crawl settings. Click on “Spider” to customize which content you want crawled and which data you want to see. You can choose to include or exclude images, JavaScript, subdomains, CSS, and other specific types of content in your crawl. Click “Include & Exclude” from the menu to paste specific URLs that you want to include in or exclude from the crawl. If you want to integrate your crawl with Google Analytics or Google Search Console, click on “API Access.”
Want a quick way to export your crawl data? From the Bulk Export menu, you’ll be able to choose to export addresses with specific response codes (like 401s), directives, inlinks, anchor text, images, and more.
From the Reports menu, you can download an overview of your crawl as well as reports on data such as redirect chains and canonical errors.
Last but not least, use the Sitemaps menu to construct an XML sitemap for your site.
Now that you’ve got a basic understanding of all the menu options, we can move on to using Screaming Frog SEO Spider for your crawls. Simply copy and paste the website you wish to crawl in the bar on the top of the screen and click “Start.” The spider will then crawl your site and give you your data.
I’ve Crawled My Website. What Now?
Before we dive deeper into Screaming Frog, I want to let you in on a few pointers that I had no clue about when I began using Screaming Frog. These tips were life-changing for me as a beginner, and I wish I had known them sooner:
You Have Three Windows
I don’t know if it’s just me, but when I started using Screaming Frog I felt a little lost. At first I only focused on the main window. However, there is a column on your right that can help you navigate not only to other SEO elements but to filters as well. Another window sits right below your main window at the bottom of the screen. It took me months to realize it was there! If you select a page in the main window, this pane and its tabs will give you more information and details about that specific page.
You Can Change The Window Size
Sometimes I want to look at the information on the bottom window, but there’s not enough space to see it all. If you want to make any of your windows bigger or smaller, just hover in between two windows until the arrow appears and drag until you are satisfied with the size. Resize your windows so you can easily view the data you want to see.
You Can Drag and Sort Columns
The first time I pulled up Screaming Frog SEO Spider I was overwhelmed with information. How was I going to organize and understand all of this data? Lucky for me, I found out that you can drag columns in your main window and place them wherever you want on the report. Simply click and drag a column to where you want to place it. After a crawl, I drag all the columns and information I don’t want far away from the URL and drag the more important columns I like to see closer by the URL. This allows me to not have to scroll as much. Additionally, if you want to sort your columns by number just click on the column and it will sort the numbers for you from highest to lowest. Click again to sort from the lowest to highest number.
Screaming Frog SEO Spider is capable of giving you pages and pages of data. Now it’s time to look at it and use its features to analyze your website. Here is a list of 12 different ways I use Screaming Frog to inspect and optimize a site.
How to Use Screaming Frog (Beginner’s Guide)
- Check Response Codes
- Check Your URL Structures
- Get Correct Page Titles
- Analyze Meta Descriptions
- Preview & Test SERP Snippets
- Inspect H1 Titles
- Optimize Images
- Check Directives
- Analyze Crawl Depth
- Look at Response Time
- Find Pages With Thin Content
- Generate an Insecure Content Report
1. Checking Response Codes
Near the top of the screen, click on the grey tab labeled “Response Codes.” This will show the response codes for all the URLs of the crawled website. To make things easier, click on the “Status Code” column to sort the URLs by status code number.
From here you can check the redirects and make sure that the correct response code is being implemented on each page. A 301 response code is a permanent redirect, while a 302 is a temporary redirect (which is rarely the right choice).
After you double-check your 301 redirects, it’s time to move on to the addresses that returned 404 error response codes. A 404 response code means that the URL points to a non-existent page; typically, visitors see an error page when they follow such a link. To solve this issue, identify all the pages with 404 error codes and 301 redirect them to a relevant page.
But wait – what about all the pages that are linking to the 404 pages? Lucky for you, with the Response Codes window open you can also check for broken internal and external links. Select a URL with a 4XX or 5XX code. In the bottom window, click on the “Inlinks” and “Outlinks” tabs to view a list of addresses that link to the broken URL.
If you want to export all of the redirects or 4XX errors, near the top of the screen there’s a filter drop-down menu. Choose what you would like to export, then click the “Export” button beside it to get your data.
Updating Response Codes can be quite tedious. However, if you update all the pages with redirects and error codes, your user experience will greatly improve (and your rankings will too).
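If it helps to reason about this triage outside the tool, the same grouping can be sketched in a few lines of code. This is a hypothetical illustration (not part of Screaming Frog) that sorts exported URL/status-code pairs into buckets the way the Response Codes tab does:

```python
def bucket_by_status(rows):
    """rows: iterable of (url, status_code) pairs from a crawl export.
    Groups URLs by response-code class: 2xx OK, 3xx redirect,
    4xx client error, 5xx server error."""
    buckets = {"ok": [], "redirect": [], "client_error": [], "server_error": []}
    for url, code in rows:
        if 200 <= code < 300:
            buckets["ok"].append(url)
        elif 300 <= code < 400:
            buckets["redirect"].append(url)
        elif 400 <= code < 500:
            buckets["client_error"].append(url)
        else:
            buckets["server_error"].append(url)
    return buckets

# Made-up example crawl data:
crawl = [
    ("https://example.com/", 200),
    ("https://example.com/old-page", 301),
    ("https://example.com/missing", 404),
]
print(bucket_by_status(crawl)["client_error"])  # the 404s to fix first
```

The "client_error" bucket is the list you would work through first, redirecting each entry to a relevant live page.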
2. Check Your URL Structures
Website addresses that contain unusual parameters and characters are hard to crawl and rank in search results. Additionally, confusing URLs lead to poor user experience and will make it harder for people to easily navigate throughout your website. To make your website easier to crawl and navigate, check your URLs’ structure.
Go to the “URI” tab on Screaming Frog SEO Spider and look at your list of addresses. As you go through each address make sure that it is:
- a reasonable length (about four to five words)
- descriptive (no weird characters or punctuation marks in the parameters)
- unique (no duplicates)
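As a rough illustration of those three checks, here is a small Python sketch. The thresholds, regex, and example URLs are my own assumptions for the sake of the example, so adjust them to your standards:

```python
import re
from urllib.parse import urlparse

def url_issues(url, seen_paths):
    """Flag structural problems in a URL. Thresholds are illustrative."""
    issues = []
    parsed = urlparse(url)
    path = parsed.path.strip("/")
    slug = path.split("/")[-1]
    if parsed.query:                       # parameters make URLs hard to read
        issues.append("query parameters")
    if len(slug.split("-")) > 5:           # more than about four to five words
        issues.append("too long")
    if re.search(r"[^a-z0-9\-/]", path):   # odd characters or punctuation
        issues.append("non-descriptive characters")
    if path in seen_paths:                 # duplicate address
        issues.append("duplicate")
    seen_paths.add(path)
    return issues

seen = set()
print(url_issues("https://example.com/blog/screaming-frog-guide", seen))  # []
print(url_issues("https://example.com/p?id=123&ref=XYZ", seen))           # ['query parameters']
```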
3. Correct Page Titles
Next, we will move to the “Page Title” tab. Once you click on the tab, you’ll see all your page titles as well as your title pixel length and word count. As you go through each page title, you’ll want to look for a few things (you can use the filter at the top to easily find page titles that could be a problem).
There may be pages with the same page title. Make the correct changes so that each and every page title on your site is unique.
Are keywords being optimized in the page title? Ensure that each page title contains your most important keywords.
Use the “Character Count” column to ensure that each title is around 50-65 characters long. Too long and it will be cut off in the SERPs. Too short and it will not be engaging to your audience.
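Those checks boil down to simple rules, so here is a hypothetical sketch of how you might flag them over an exported URL-to-title mapping (the length thresholds follow this guide; the sample pages are invented, and the same idea applies to meta descriptions with a roughly 160-character limit):

```python
def title_problems(titles):
    """titles: dict mapping URL -> page title (e.g. from a crawl export).
    Flags missing, too-short, too-long, and duplicate titles."""
    seen = {}
    report = {}
    for url, title in titles.items():
        problems = []
        if not title:
            problems.append("missing")
        elif len(title) < 50:
            problems.append("too short")
        elif len(title) > 65:
            problems.append("too long")
        if title and title in seen:
            problems.append("duplicate of " + seen[title])
        elif title:
            seen[title] = url
        report[url] = problems
    return report

# Invented example pages: the homepage title is fine, /about copies it,
# and /shop is far too short.
pages = {
    "/": "Acme Widgets | Hand-Built Widgets, Gears & Gadgets Since 1999",
    "/shop": "Shop",
    "/about": "Acme Widgets | Hand-Built Widgets, Gears & Gadgets Since 1999",
}
print(title_problems(pages))
```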
4. Analyze Meta Descriptions
Similarly, we need to check the website’s meta descriptions and ensure that they are fully optimized. Check for and change meta descriptions that have duplicate content, are not keyword optimized, or are missing. Keep in mind the character count should be around 160 characters. If you’d like to filter and check for specific issues (for example missing meta descriptions), use the filter drop-down menu.
5. Preview & Test SERP Snippets
Did you know that you can preview what your SERP snippets look like? It took me a while to find out that you could do this in Screaming Frog SEO Spider. Simply click on any link address you wish to preview and then in the lower window near the bottom of your screen click “SERP Snippet.” From here you can see your SERP Snippet as well as a data table that includes title and description length.
If you don’t like what you see, you can reset the meta title and description and preview what it will look like. Once you are satisfied, export your new meta titles and descriptions in a spreadsheet for quick implementation.
6. Inspect H1 Titles
Under the next tab labeled “H1” you’ll quickly be able to see which H1s are missing on your site. Be sure to also check for duplicates and ensure that all your H1 titles are specific and unique. Optimizing your H1s within your site will greatly improve your keyword rankings. As you are going through your H1s, ask yourself the following questions:
- Is there only one H1 on each page?
- Do all H1s describe the topic of each page?
- Are all H1s 20-70 characters?
- Do your H1s encourage good user experience?
- Is each H1 keyword optimized?
If you’ve answered yes to all of those questions, you’re ready to move on. If not, you have a little more work to do.
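If you ever need to spot-check a single page’s HTML outside the tool, counting H1s takes only a few lines with Python’s standard library (the sample markup below is made up):

```python
from html.parser import HTMLParser

class H1Counter(HTMLParser):
    """Count <h1> tags in an HTML document."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.count += 1

def count_h1s(html):
    parser = H1Counter()
    parser.feed(html)
    return parser.count

page = "<html><body><h1>Only Heading</h1><h2>Sub</h2></body></html>"
print(count_h1s(page))  # 1
```

Anything other than a count of exactly one for a page is worth a closer look.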
7. Optimize Images
You might be asking, “Why do I have to check images?” The answer is that images can take up a lot of bandwidth and can slow your pages’ load time. One rule of thumb is to keep your images under 100kb. With Screaming Frog SEO Spider, you can check to see which images are slowing down your site.
To do this, click on the “Images” tab in Screaming Frog SEO Spider. You’ll then see a list of all the images on your website, along with their size and how many inbound links each has. To simplify your data, go to the filter in the top-left corner and click “Over 100 kb” to view every image that is over 100 kb. Then go through these images and see if you can either compress them to a smaller size or replace them with a smaller image.
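You can also export that Images list and filter it yourself. Here is a rough sketch assuming a CSV with address and size columns; the column names and sample rows below are invented, so match them to your actual export:

```python
import csv
import io

# Made-up stand-in for an exported "Images" CSV:
sample_csv = """Address,Size (Bytes)
https://example.com/hero.jpg,245000
https://example.com/icon.png,8000
https://example.com/banner.png,132000
"""

def oversized_images(csv_text, limit_bytes=100_000):
    """Return addresses of images larger than limit_bytes (default 100 kb)."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row["Address"] for row in reader
            if int(row["Size (Bytes)"]) > limit_bytes]

print(oversized_images(sample_csv))
# ['https://example.com/hero.jpg', 'https://example.com/banner.png']
```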
8. Check Directives
One way to check for technical issues is to go to the “Directives” tab. Click on the filter drop-down menu in the top-left corner and choose which option you’d like to see. From the filter menu you can view web addresses within your site that contain canonical, no canonical, index, noindex, follow, nofollow, and other properties. This is a great way to see how robots directives and canonicals are being implemented on your website. You might need to make adjustments to what is being indexed, followed, and used as a canonical.
9. Analyze Crawl Depth
Now it’s time to return to the “Internal” tab (the first tab in Screaming Frog). First, let’s look at the website’s crawl depth. A URL’s Crawl Depth is the number of clicks it takes to get from the homepage to that specific page. Crawl Depth will vary depending on how big the website is. Keep in mind that the higher a page’s Crawl Depth, the harder it is for visitors to reach that page, and the harder it will be for the page to be crawled and ranked. If an important page on your website (like a product) has a Crawl Depth of 4, you might want to consider changing the website’s organization and hierarchy so that page has a Crawl Depth of 2.
To check your Crawl Depth, click on the bottom scroller in the main window and scroll to the right until you see the column labeled “Crawl Depth.” You can then drag that column closer to the URL so you can see the address and the Crawl Depth data at the same time. You can click on the Crawl Depth column itself to sort by number. As you look at your addresses, look for important pages that have too high of a Crawl Depth number and take note of them in your website audit or project.
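Under the hood, Crawl Depth is just a breadth-first search outward from the homepage. Here is a minimal sketch on a made-up link graph:

```python
from collections import deque

def crawl_depths(links, home):
    """links: dict mapping each page to the pages it links to.
    Returns the minimum number of clicks from home to each page (BFS)."""
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for nxt in links.get(page, []):
            if nxt not in depth:
                depth[nxt] = depth[page] + 1
                queue.append(nxt)
    return depth

# Invented example site structure:
site = {
    "/": ["/shop", "/about"],
    "/shop": ["/shop/widgets"],
    "/shop/widgets": ["/shop/widgets/blue"],
}
print(crawl_depths(site, "/"))
# {'/': 0, '/shop': 1, '/about': 1, '/shop/widgets': 2, '/shop/widgets/blue': 3}
```

In this toy example, linking to `/shop/widgets/blue` directly from `/shop` (or the homepage) would cut its depth, which is exactly the kind of hierarchy change the audit should suggest.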
10. Look at Response Time
Next, let’s quickly look at the response time of each page on your website. This is a fast and easy way to identify pages that load too slowly. If you have a slow webpage, it will take longer to be crawled. Maile Ohye from Google has said that “2 seconds is the threshold for an ecommerce site…[but] Google aims for half a second.” A page that takes longer than 2 seconds to load can hurt user experience and slow down crawlers.
In the same “Internal” tab, use the bottom scroller and scroll to the right until you see the “Response Time” column. Again, feel free to drag it closer to the URL column. Look at each page’s Response Time to see how fast it loads. If a page takes longer than 2 seconds to load, you’ll want to take a closer look at it.
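As a quick illustration of that 2-second rule, here is a hypothetical filter over exported response times (the URLs and timings are made up):

```python
def slow_pages(response_times, threshold=2.0):
    """response_times: dict mapping URL -> seconds (from the Response Time
    column). Returns pages slower than the threshold, slowest first."""
    slow = {url: t for url, t in response_times.items() if t > threshold}
    return sorted(slow, key=slow.get, reverse=True)

times = {"/": 0.4, "/shop": 2.7, "/blog": 3.1}
print(slow_pages(times))  # ['/blog', '/shop']
```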
11. Find Pages With Thin Content
Google has stated that it wants to rank high-quality content. An in-depth page with a lot of descriptive, useful information performs substantially better than a page with only a few vague paragraphs. Pages with thin content struggle to rank because keywords can only be used a few times and can’t be fully optimized.
With the “Internal” tab still open, scroll the bottom scroller until you see the “Word Count” column. By clicking on the Word Count column you can sort your pages from having the least words to the most. As you go through the column, ask yourself the following questions:
- Are there important pages that are only a couple hundred words and need expanding?
- Is the content on each page keyword optimized?
- Are there any pages that might have too high of a word count? (One example could be a contact page that has 7000 words)
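A rough word count like the one in that column takes only a line of Python. The page texts below are invented examples, and the 300-word threshold is my own assumption rather than an official cutoff:

```python
import re

def word_count(text):
    """Roughly count words, similar to a crawler's Word Count column."""
    return len(re.findall(r"\b\w+\b", text))

# Invented example pages: one obviously thin, one long enough.
pages = {"/contact": "Call us today.", "/guide": "word " * 500}
thin = [url for url, text in pages.items() if word_count(text) < 300]
print(thin)  # ['/contact']
```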
12. Generate an Insecure Content Report
The last feature that I want to talk about in Screaming Frog SEO Spider is the Insecure Content Report. With this report, you can check which pages of your website still reference insecure (HTTP) content.
Within Screaming Frog you will also want to make sure that no page includes HTTP in the actual URL. Instead, ensure each of your website’s pages is served over HTTPS, which stands for Hypertext Transfer Protocol Secure. HTTPS sites are safer and will keep your users’ information protected, while HTTP sites are more susceptible to being compromised.
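To see the kind of thing such a report looks for, here is a simplified sketch that scans a page’s HTML for http:// resource references. The regex and sample markup are illustrative only, not how Screaming Frog actually implements the check:

```python
import re

def insecure_references(html):
    """Find http:// (non-HTTPS) src/href references in a page's HTML."""
    return re.findall(r'(?:src|href)="(http://[^"]+)"', html)

page = ('<img src="http://example.com/logo.png">'
        '<a href="https://example.com/about">About</a>')
print(insecure_references(page))  # ['http://example.com/logo.png']
```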
So is that everything Screaming Frog can do? Well, no. Screaming Frog has many more tools and capabilities that you can learn about. You can even connect Screaming Frog to Google Analytics and Google Search Console for an even more powerful experience. But, hopefully, this gives you a better basis and understanding of how Screaming Frog SEO Spider helps you in your SEO efforts. When utilized correctly, Screaming Frog can become a powerful tool in deciphering a website.
Which Screaming Frog SEO Spider feature was your favorite? Are there any features that weren’t covered that you like to use? I’d love to hear your comments!