If you’re trying to improve your SEO rankings in Google to no avail, there’s always the possibility that your website is not being properly indexed. In other words, Google is having trouble accessing your web pages, so it can’t crawl and index your site’s content.
Firstly, don’t worry: if you suspect this problem is affecting you, here’s what you can do about it.
Check Google Webmaster Tools
The easiest way to check whether your site is being properly crawled and indexed by Google is to log into Google Webmaster Tools. From there, you’ll be able to check the Google Index tab, which shows the total number of pages Google has indexed. If that number has declined, it could be down to 5xx server error codes or slow connect times – two technical signals Google recently revealed it uses to decide when to temporarily ‘back off’ from a website – and that is probably why you’re seeing a drop in your traffic levels.
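You can spot-check whether your own server is sending either of those signals. Below is a minimal sketch using Python’s standard library; the URL and timeout are placeholder values for illustration, not thresholds Google has published.

```python
import time
import urllib.error
import urllib.request

def check_url(url, timeout=5.0):
    """Fetch url and return its HTTP status code and total response time."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            status = response.status
    except urllib.error.HTTPError as err:
        status = err.code  # 4xx/5xx responses arrive as HTTPError
    except urllib.error.URLError:
        status = None  # DNS or connectivity error, another item on Google's list
    elapsed = time.monotonic() - start
    return status, elapsed

def is_server_error(status):
    """5xx codes are the 'server error' range that can make Google back off."""
    return status is not None and 500 <= status < 600

status, elapsed = check_url("https://example.com/")
print(f"HTTP {status} in {elapsed:.2f}s; server error: {is_server_error(status)}")
```

Running this against a few key pages on your site gives you a quick, independent read on the same signals before you dig into the reports.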
Other crawler errors
To check for other reasons why Googlebot may have backed off, Google Webmaster Tools is again the place to look. Head to your dashboard, and from there check your crawl error messages. You’re most likely to discover a 404 HTTP status code warning, stating that one or more URLs cannot be found. Other errors can include:
– 5xx server error codes
– Meta tag issues (such as a stray noindex directive)
– DNS or connectivity errors
– URL parameters
– Inherited issues
– Syntax errors or structural issues
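One of those meta tag issues is worth singling out: a page served with `<meta name="robots" content="noindex">` tells Google not to index it, even when everything else is healthy. Here’s a minimal sketch using Python’s standard-library HTML parser to flag it; the sample markup is purely illustrative.

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Scans HTML for a robots meta tag containing a noindex directive."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            directives = (attrs.get("content") or "").lower()
            if "noindex" in directives:
                self.noindex = True

def page_blocks_indexing(html):
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.noindex

sample = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
print(page_blocks_indexing(sample))  # True
```

A quick pass like this over your key landing pages can catch a noindex tag that slipped in during a redesign or a staging-to-live migration.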
Is your website new?
You need at least one inbound link for your website to be successfully indexed by Google, which is why it can take a while for new websites to be indexed. If that’s the case, try not to worry.
Remember, these checks are relatively easy to carry out, and the underlying problems are usually straightforward to fix (as long as you haven’t been penalised by Google for some reason). So don’t let simple errors affect your rankings: make sure your website is being indexed efficiently by Google.