How To Identify Site Issues Using Search Console

How Do I Fix Crawl Errors in Google Search Console?

Search Console is very important to me as an SEO. I use it mostly to investigate and fix issues involving traffic and ranking drops.

Ranking reports and analytics are very useful on their own, but when you combine that data with Search Console data you can build a more complete picture of the potential issues.

In this article I’ll show you how I use Search Console to identify the most common issues I run across and how I solve them. It’s time to put on your detective hat and investigate where the issues lie. At the very least, you’ll be able to come up with a timeline of when the issues began. You can then take the data back to your team and start looking into possible causes from a website (coding, design or backend) standpoint.

Important:

Before we begin I want to make this clear: you must verify the correct domain in Search Console. If your site resolves to http://www.abc.com, do not verify http://abc.com. Google will only provide data on the version of the site you verify. The same goes for subdomains and mobile URLs; you must verify those separately as well.

However, if your canonical URLs are not set up properly, you may be getting traffic from both the secure and non-secure or the www and non-www versions of your URLs.

If traffic seems low to you (compared to analytics), be sure to verify that all possible URL combinations point to the proper “final” URL. Also, verify that the redirects are 301s and not 302s, JavaScript redirects or meta redirects. You can use this Chrome extension to see the types of redirects in place.
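If you’d rather script this check than click through a browser extension, here is a minimal Python sketch (using the third-party requests package; the abc.com variants are placeholders for your own domain) that prints every hop in the redirect chain so you can confirm each variant reaches the final URL via a 301. Note that it only surfaces HTTP redirects; meta refresh and JavaScript redirects won’t show up in the history.

```python
import requests

# Placeholder domain variants - swap in your own site's URL combinations.
variants = [
    "http://abc.com",
    "http://www.abc.com",
    "https://abc.com",
    "https://www.abc.com",
]

for url in variants:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    # resp.history holds each intermediate response in the redirect chain.
    for hop in resp.history:
        print(hop.status_code, hop.url)
    print(resp.status_code, resp.url, "(final)")
    print()
```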

Error Reports:

404 Errors: What are they and why can they be problematic?

A 404 status means that a page cannot be found (broken). This could be due to recent site changes, a database issue, etc. Broken pages aren’t inherently bad, and all sites have them. However, they become a problem when there is a large number of these pages and/or the number is increasing.

Search Console 404 error report

How to Fix 404 Pages

Search Console will allow you to export a list of the most recent 1,000 broken pages. Go through the list and try to identify the reasons why these pages are broken. For pages that no longer exist, such as old products, former employees or retired service pages, ask whether there are similar pages they can be redirected (301 permanent) to. In some cases you can identify a pattern that makes the fix easy.

For example, if you sold sneakers model 123 in black and blue but no longer carry the blue, you could redirect the blue page to the black page.

If a former employee has left your company, you could redirect their old page to the main company directory page.

Perhaps you offered a service and created a similar page elsewhere; you can redirect the old URL to the new one. However, if you no longer offer the service and there is no comparable page, then it is okay to leave the page broken.
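Once you’ve decided on the replacement pages, it helps to verify that each redirect actually returns a 301 and lands where you intended. A rough sketch, assuming the requests package and made-up example URLs:

```python
import requests

# Hypothetical old-to-new mapping built from the 404 export - example paths only.
redirect_map = {
    "https://www.abc.com/sneakers-123-blue": "https://www.abc.com/sneakers-123-black",
    "https://www.abc.com/staff/jane-doe": "https://www.abc.com/staff/",
}

for old_url, expected in redirect_map.items():
    resp = requests.get(old_url, allow_redirects=True, timeout=10)
    # The first hop should be a 301; resp.url is where the chain finally lands.
    first_hop = resp.history[0].status_code if resp.history else resp.status_code
    ok = first_hop == 301 and resp.url == expected
    print("OK " if ok else "FIX", old_url, "->", resp.url, f"({first_hop})")
```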

Do NOT redirect the URL(s) to the home page. Mass redirecting a large number of broken pages to the home page or to other sections of the site can trigger soft 404s. When Google flags a soft 404, it is indicating to you that the redirects in place appear irrelevant. This provides a poor user experience and should therefore be corrected.

How to Fix Soft 404 Errors

Google has stated that there is no penalty for soft 404 errors on your site. However, in my experience a large number of soft 404 pages will always impact rankings. First, keep in mind that these pages were once ranking (and driving traffic) but are now being redirected; the old pages will drop out of Google and won’t drive organic traffic anymore. Second, you’re wasting your site’s crawl budget by having Google crawl redirects from pages that no longer exist.

Important tip: If you see a large number of soft 404s, verify that your 404 pages are actually returning a 404 status. I’ve seen many instances where 404 pages were returning a 200 status, which Google will flag as a soft 404!
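One quick way to test this yourself: request a URL that cannot possibly exist and look at the status code. A minimal sketch, assuming the requests package (the path is deliberately made up):

```python
import requests

# A deliberately nonexistent path - the server should answer with a real 404.
resp = requests.get("https://www.abc.com/this-page-should-not-exist-xyz", timeout=10)
print(resp.status_code)  # expect 404; a 200 here is a classic soft-404 cause
```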

Crawl Stats:

Why are crawl stats so important? We want to be able to see how many pages Google is crawling and how long it takes Google to crawl that content.

Pages crawled in Search Console

If you don’t add a lot of new content, the “pages crawled per day” figure will stay roughly the same. If you see a lot of spikes in this graph, you will need to research why Google is crawling more pages on some days and fewer on others. The graph above is from a client that had technical issues; once they were fixed, Google started indexing more pages.

Google search console bytes per day

This graph is only meaningful when compared to the other two graphs in this section. If the other two have spikes, this one should as well; if the other two are steady, so should this one be. It shows how much data Google has downloaded from your site. If you see large spikes here, it can be a sign that Google is crawling sections of the site it shouldn’t be.

This could be due to Google crawling site search results, duplicate content or infinite redirect loops.
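Your server logs can tell you exactly which sections Googlebot is hitting during a spike. Here’s a rough sketch that tallies Googlebot requests by top-level folder; it assumes a combined-format access log named access.log, so adjust the field positions for your own setup:

```python
from collections import Counter

counts = Counter()
with open("access.log") as f:
    for line in f:
        if "Googlebot" not in line:
            continue
        parts = line.split()
        if len(parts) > 6:
            # In combined log format the request path is the 7th field.
            path = parts[6]
            # Group hits by top-level folder to spot unexpected sections.
            section = "/" + path.lstrip("/").split("/")[0]
            counts[section] += 1

for section, n in counts.most_common(15):
    print(f"{n:6d}  {section}")
```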

Time spent downloading a page

This section shows how much time Google has spent downloading pages. The lower the number, the faster Google is crawling your site. Longer download times can cause indexing issues. The graph above shows a drop in download time, which was due to adding AMP pages to the site.
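You can get a rough, single-request feel for this number yourself. A tiny sketch, assuming the requests package (requests’ elapsed attribute measures time to the response, which is only a loose proxy for what Google reports):

```python
import requests

resp = requests.get("https://www.abc.com/", timeout=30)  # placeholder URL
# elapsed covers the time from sending the request to receiving the response.
print(f"{resp.elapsed.total_seconds() * 1000:.0f} ms, {len(resp.content)} bytes")
```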

Fetch as Google

You can use this tool…

  1. To force Google to crawl a page or the entire site. Useful when updating your site/page after fixing technical issues. Great to use after a launch or redesign.
  2. To ensure you’re showing Google and visitors the same version of your site.

If you ask Google to fetch a page, Google will report back the code that it sees when it crawls the page. This is helpful for identifying the code Google is seeing on the page and the type of redirects in place, if any.
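You can approximate part of this check from your own machine by fetching the page with a Googlebot user agent and a browser user agent and comparing the results. This is only a rough server-side comparison (Fetch as Google also executes the page); the URL and user-agent strings below are illustrative:

```python
import requests

url = "https://www.abc.com/"  # placeholder
googlebot_ua = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
browser_ua = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

as_google = requests.get(url, headers={"User-Agent": googlebot_ua}, timeout=10)
as_visitor = requests.get(url, headers={"User-Agent": browser_ua}, timeout=10)

# Differing final URLs or wildly different sizes can point to UA-based
# redirects or cloaking worth investigating.
print("Googlebot:", as_google.status_code, as_google.url, len(as_google.text), "chars")
print("Browser:  ", as_visitor.status_code, as_visitor.url, len(as_visitor.text), "chars")
```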

Fetch and Render

If you ask Google to fetch and render, Google will show you a visual of how the page looks to Google and to the visitor.

Fetch and Render Search Console

Here you want to be sure Google is seeing the same “page” as the visitor. If Google is seeing something different, it could be that a script or stylesheet is being blocked from Googlebot. Developers sometimes block CSS files, JavaScript and other files to improve load times. Left in place too long, this can cause rankings in Google to drop.

If scripts are taking too long to load and/or are being blocked, consider reducing the number of requests by combining and minifying the files. I would also consider creating AMP pages to improve load times and performance.
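A quick way to test whether your robots.txt is blocking the files Google needs: the Python standard library ships a robots.txt parser. The asset URLs below are examples; point it at your real CSS/JS files:

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.abc.com/robots.txt")  # placeholder domain
rp.read()

# Example asset URLs - replace with the CSS/JS files your pages load.
assets = [
    "https://www.abc.com/assets/site.css",
    "https://www.abc.com/assets/app.js",
]
for url in assets:
    allowed = rp.can_fetch("Googlebot", url)
    print("allowed" if allowed else "BLOCKED", url)
```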

Site Performance

Reviewing the impressions and clicks within Search Console will help you identify issues. Here is an example of a site whose Google traffic decreased after a site update.

impressions in Google search console

This data is very valuable because Search Console gives you the dates when traffic dropped. We can now research what changes were implemented on the site before and on the day of the drop. If no technical changes were made to the site, you can start looking at other issues, such as a manual or algorithmic penalty caused by bad links, duplicate content, etc.

Pages Report

You want to view this section to determine whether all of the traffic is going to the home page. In the example below, you can see that while impressions are increasing in Google, the average position has dropped.

Traffic drop showing in Search Console

If you go through the pages and sort by position, you can see which pages have high impressions but a low CTR. This is an indication that the content ranking high is irrelevant for the search terms it ranks for. You need to investigate what could have happened at the time the positions dropped.

CTR
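If you export the pages report to CSV, a short script can surface the high-impression, low-CTR pages for you. A sketch assuming the pandas package and an export named pages.csv with Page, Clicks, Impressions and Position columns (your export’s column names may differ):

```python
import pandas as pd

df = pd.read_csv("pages.csv")
# Recompute CTR from raw counts so we don't depend on the export's formatting.
df["ctr"] = df["Clicks"] / df["Impressions"]

# Flag pages with plenty of impressions but almost no clicks - the thresholds
# here are arbitrary starting points, tune them for your traffic levels.
suspects = df[(df["Impressions"] > 1000) & (df["ctr"] < 0.01)]
print(suspects.sort_values("Impressions", ascending=False)
              [["Page", "Impressions", "Clicks", "ctr", "Position"]]
              .head(20))
```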

So, what happened? The client changed the site architecture, and rankings for pages dropped. While impressions from Google were increasing due to more pages in the sitemap, rankings dropped for many key terms.

Why did the navigation impact rankings? Google needs to be able to crawl the entire site in three clicks or fewer. Visitors should not need to hit the back button, and your site should not rely on a search box alone. Google can only crawl links; the harder it is for Google to find the links, the less likely those pages will rank.
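If you want to measure click depth on your own site, a small breadth-first crawl from the home page will flag pages buried deeper than three clicks. An illustrative sketch assuming the requests and beautifulsoup4 packages (capped at 200 pages to keep it polite):

```python
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

start = "https://www.abc.com/"  # placeholder home page
host = urlparse(start).netloc
depth = {start: 0}
queue = deque([start])

while queue and len(depth) < 200:  # small cap for demonstration purposes
    url = queue.popleft()
    try:
        resp = requests.get(url, timeout=10)
    except requests.RequestException:
        continue
    soup = BeautifulSoup(resp.text, "html.parser")
    for a in soup.find_all("a", href=True):
        # Resolve relative links and drop fragments before queueing.
        link = urljoin(url, a["href"]).split("#")[0]
        if urlparse(link).netloc == host and link not in depth:
            depth[link] = depth[url] + 1
            queue.append(link)

for url, d in sorted(depth.items(), key=lambda item: item[1]):
    if d > 3:
        print("depth", d, url)
```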

Search Queries

search query

What keywords are showing in the “search query” section? Desired keywords? Branded terms? If only branded terms show up in this section it could mean:

  1. You have been hit with a penalty
  2. Your site isn’t optimized for your target keywords, or…
  3. The keywords you’re targeting are ranking poorly
  4. Interior pages could be blocking Google, either via the robots.txt file or a noindex tag on individual pages (a quick check for both is sketched below).
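To rule out number 4 quickly, you can script both checks: ask robots.txt whether Googlebot may fetch the page, and look for a noindex meta tag in the page itself. A sketch with placeholder URLs, assuming the requests and beautifulsoup4 packages:

```python
from urllib.robotparser import RobotFileParser

import requests
from bs4 import BeautifulSoup

rp = RobotFileParser("https://www.abc.com/robots.txt")  # placeholder domain
rp.read()

# Example interior pages - substitute the URLs that stopped ranking.
pages = ["https://www.abc.com/services/", "https://www.abc.com/products/"]
for url in pages:
    blocked = not rp.can_fetch("Googlebot", url)
    resp = requests.get(url, timeout=10)
    meta = BeautifulSoup(resp.text, "html.parser").find("meta", attrs={"name": "robots"})
    noindex = bool(meta and "noindex" in meta.get("content", "").lower())
    print(url, "| robots.txt blocked:", blocked, "| noindex:", noindex)
```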

Most of the time sites aren’t optimized properly for their targeted terms. If the site ranked for key terms before and now only ranks for branded terms, I’d check number 4 above before assuming it is a penalty.

Did you find this article useful? If so, please share it. If you still need help identifying site issues contact me!