
How to Identify Website Errors and Fix Those Problems

Today, your website is the beating heart of your brand. If your consumers are facing website errors and a poor user experience, it’s more than likely they will look elsewhere. According to Statista, website crashes and timeouts are among the primary reasons customers abandon their carts. If your site takes too long to load or suffers from easy-to-fix issues, you are, effectively, throwing money away. The first step to improving website optimisation is to identify your website errors.

What are the common website errors?

It’s lucky you asked, as this is where we come in. Resolving website errors is not only one of our top SEO techniques but will, subsequently, increase conversions. There’s no conversion killer quite like a poor website.

Google Search Console

In terms of finding any website errors, by far the most useful tool is Google Search Console. Google has recently refreshed the platform’s appearance, and the new interface is currently in beta. The beta features are limited, but it’s a sure sign that Google is investing in the accuracy and usefulness of the platform.

The Google Search Console dashboard reflects how Google views your website. It analyses the traffic arriving at your website and highlights any website errors that may need addressing.


Crawl Errors

Website crawl errors are the issues Google has encountered when looking through your website and indexing pages. The reported errors are split into the following categories.


DNS Errors

DNS (Domain Name System) errors are critical for your site. If the Googlebot is experiencing DNS issues, it cannot connect to your domain, whether because of a DNS timeout or a DNS lookup failure. Luckily, these are very rare, but if you do face them, contact your in-house IT team or web hosting company immediately. DNS resolution matters because it’s the first step your consumers take to access your website, so always act swiftly on any DNS errors that prevent Google from connecting to your site.
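If you want a quick sanity check outside of Search Console, a few lines of Python can confirm whether your domain currently resolves. This is only a minimal sketch using the standard library; `example.com` is a placeholder for your own domain.

```python
import socket

def check_dns(domain: str) -> None:
    """Attempt a DNS lookup and report the resolved IP addresses."""
    try:
        # getaddrinfo performs the same lookup a browser (or the Googlebot) relies on
        results = socket.getaddrinfo(domain, 80, proto=socket.IPPROTO_TCP)
        ips = sorted({item[4][0] for item in results})
        print(f"{domain} resolves to: {', '.join(ips)}")
    except socket.gaierror as exc:
        # A failure here mirrors the DNS lookup errors Search Console reports
        print(f"DNS lookup failed for {domain}: {exc}")

if __name__ == "__main__":
    check_dns("example.com")  # replace with your own domain
```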

Server Connectivity

A server error typically means your server is taking too long to respond, so the request times out. The Googlebot is a busy robot and can only wait so long to access your site; if it cannot do so within that timeframe, it quite literally gives up. Unlike the DNS issues above, this points to a problem with your server itself, for example the server being down or running slowly. A one-off timeout may not be a cause for concern, but repeated errors are worth investigating.
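A simple way to spot connectivity trouble yourself is to time a request to your own site. The sketch below assumes the third-party `requests` library is installed and uses a placeholder URL and an arbitrary 10-second patience threshold.

```python
import requests

URL = "https://example.com/"   # placeholder: your own homepage
TIMEOUT_SECONDS = 10           # how long you are willing to wait

def check_server(url: str) -> None:
    """Request a page and report the status code and response time."""
    try:
        response = requests.get(url, timeout=TIMEOUT_SECONDS)
        elapsed = response.elapsed.total_seconds()
        print(f"{url} answered {response.status_code} in {elapsed:.2f}s")
        if elapsed > 5:
            print("Warning: slow response; a crawler may give up before this completes.")
    except requests.exceptions.Timeout:
        print(f"{url} did not respond within {TIMEOUT_SECONDS}s (connectivity issue).")
    except requests.exceptions.RequestException as exc:
        print(f"Request failed: {exc}")

if __name__ == "__main__":
    check_server(URL)
```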

Robots.txt Fetch

Essentially, the Googlebot cannot retrieve your robots.txt file, located at [yourdomain.com]/robots.txt. However, this is a very straightforward fix (finally). You can use a robots.txt generator, or have your developer create one and upload it to the root of your server. If the file existed previously but is no longer accessible, you will need to investigate why.
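To confirm the file is reachable and isn’t accidentally blocking important pages, you can lean on Python’s built-in robots.txt parser. The domain and sample page below are placeholders, and this is only a rough check rather than a substitute for the Search Console report.

```python
from urllib import robotparser

DOMAIN = "https://example.com"        # placeholder: your own domain
ROBOTS_URL = f"{DOMAIN}/robots.txt"

parser = robotparser.RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()                         # fetches and parses the live file

# Note: if robots.txt returns a 404, can_fetch() treats everything as allowed,
# so also confirm the file actually loads in your browser.
sample_page = f"{DOMAIN}/products/"   # any page you expect Google to crawl
allowed = parser.can_fetch("Googlebot", sample_page)
print(f"Googlebot allowed to crawl {sample_page}: {allowed}")
```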

Soft 404 Error

In simple terms, a soft 404 error refers to a page returning a 200 (OK) status when it should return a 404 (not found). To resolve this Google crawler error, you will need to look at the headers sent back by the server, ensuring a 404 code is sent rather than a 200.
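One quick way to check the headers is to request a URL you know has no content and inspect the status code. This sketch again assumes the `requests` library and uses a made-up path on a placeholder domain.

```python
import requests

DOMAIN = "https://example.com"                         # placeholder domain
missing_url = f"{DOMAIN}/this-page-should-not-exist"   # a URL with no real content

response = requests.get(missing_url, timeout=10, allow_redirects=True)

# A healthy site returns 404 here; a 200 means the server is serving a
# "not found" page with a success status code, i.e. a soft 404.
if response.status_code == 200:
    print(f"Possible soft 404: {missing_url} returned 200")
else:
    print(f"{missing_url} returned {response.status_code}")
```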

404 Not Found Error

A 404 not found error means the Googlebot attempted to crawl a page that does not exist on your site. The Googlebot finds 404 pages when other sites or pages link to that non-existent URL. Unless these are critical pages for your website, you can ignore these messages; Google states they do not harm your rankings. If the missing URLs still attract traffic, though, you may want to redirect them to a relevant live page.
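If you copy the affected URLs out of Search Console into a plain text file (one URL per line; the filename below is just an assumption), a short script can confirm which ones still return 404 so you can decide which are worth redirecting. This uses the `requests` library as in the earlier sketches.

```python
import requests

# Assumed input: a plain text file with one URL per line,
# e.g. copied from the crawl errors report in Search Console.
INPUT_FILE = "reported_404s.txt"

with open(INPUT_FILE) as handle:
    urls = [line.strip() for line in handle if line.strip()]

for url in urls:
    try:
        status = requests.head(url, timeout=10, allow_redirects=True).status_code
    except requests.exceptions.RequestException:
        status = None
    if status == 404:
        print(f"Still missing (consider a redirect if it gets traffic): {url}")
    else:
        print(f"Now returns {status}: {url}")
```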

Crawl Rate

Google Search Console will detail the rate at which Google is crawling your website. You do not need to fixate on the crawl rate, but watch out for spikes. A large drop in the crawl rate could be due to:

  • A newly added robots.txt rule that is too broad, blocking Google from crawling a large area of the website
  • Broken HTML or unsupported content on your pages
  • A site that responds slowly to requests, or a rising error rate, which causes Googlebot to throttle back its requests to avoid overloading your server
  • A lowered maximum crawl rate setting


An increase in the crawl rate can almost certainly be attributed to new content on your website, or to a change that results in pages being created dynamically. For example, if you have set up product filters, the dynamically created filter pages will increase the crawl rate.

Sitemap

Another form of SEO optimisation is identifying sitemap errors. You’ll be pleased to hear these are easy to resolve. The standard errors are links that are no longer accessible, which you can correct with a dynamically created sitemap index that removes old URLs as soon as they become inaccessible. It’s worth remembering to resubmit your sitemap to Google whenever you make significant changes to your website, to give them a gentle nudge to crawl your pages.


If you do not have a sitemap index, you could face a problem when attempting to pinpoint which pages are not being indexed. For example, if 10,000 pages are contained within one sitemap file and a large number of those URLs have not been indexed, you will have a tough time resolving the issue. A sitemap index, by contrast, is essentially a directory of smaller sitemap files, usually grouped by product category.
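A sitemap index itself is just a small XML file pointing at the individual sitemaps. The sketch below shows roughly what generating one could look like, assuming you already publish one sitemap per product category at a predictable URL; the domain, category names and file paths are all placeholders.

```python
from datetime import date
from xml.sax.saxutils import escape

DOMAIN = "https://example.com"                    # placeholder domain
CATEGORIES = ["shoes", "jackets", "accessories"]  # placeholder product categories

def build_sitemap_index(domain, categories):
    """Build a sitemap index that points at one sitemap file per category."""
    today = date.today().isoformat()
    entries = []
    for category in categories:
        loc = escape(f"{domain}/sitemaps/sitemap-{category}.xml")
        entries.append(
            f"  <sitemap>\n    <loc>{loc}</loc>\n    <lastmod>{today}</lastmod>\n  </sitemap>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries) + "\n"
        "</sitemapindex>\n"
    )

if __name__ == "__main__":
    with open("sitemap_index.xml", "w") as handle:
        handle.write(build_sitemap_index(DOMAIN, CATEGORIES))
```

Grouping by category this way means that when Search Console reports unindexed URLs, you can see at a glance which section of the site they belong to.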

URL Parameters

We’ll only touch on URL parameters, as they are one of the most complex areas of website optimisation. Without solid technical knowledge you could end up doing more harm than good to your SEO, so it’s best left to your developers. The URL Parameters tool can be used to indicate how Google should treat specific URLs. For example, an e-commerce site may tell Google that it uses a country parameter to distinguish between consumers in different countries. The preferences you set encourage Google to crawl the preferred version of each URL, preventing Google from crawling duplicate content on your site.
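The underlying idea is that many parameterised URLs can point at essentially the same content. The sketch below illustrates this in Python by stripping parameters that don’t change what the page shows; the parameter names are purely illustrative, not a standard list, and this is a conceptual aid rather than anything the Search Console tool runs.

```python
from urllib.parse import urlparse, urlencode, parse_qsl, urlunparse

# Illustrative split: parameters that change the page content vs. ones that don't.
CONTENT_PARAMS = {"category", "country", "page"}

def canonicalise(url: str) -> str:
    """Strip query parameters that do not change the page content."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k in CONTENT_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

duplicates = [
    "https://example.com/shop?category=shoes&country=uk",
    "https://example.com/shop?category=shoes&country=uk&sessionid=123&sort=price",
]
for url in duplicates:
    print(f"{url} -> {canonicalise(url)}")
```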


When it comes to the SEO techniques mentioned above, we suggest you:

  • Check Google Search Console each week to catch any website errors
  • Create custom sitemaps to identify indexing issues
  • Use the URL Parameters tool to prevent Google from crawling unnecessary pages

While Google Search Console is extremely beneficial, you can also use Rage Sitemap and Screaming Frog to identify any website errors and, subsequently, improve website optimisation.

You can get in touch with our team regarding the SEO techniques mentioned above, and we’ll provide you with plenty of advice (and a smile or two).

Alan Ruddick

Alan is a full-stack developer with over 7 years’ commercial experience. Drawing on his experience with projects ranging from WordPress blogs to bespoke enterprise platforms, Alan prides himself on being able to tackle projects of any size.
