Today I wanted to give an overview of the Fetch as Google tool. On the surface this tool is a way of getting your website indexed rapidly, but it also contains a rendering tool which allows for more advanced SEO tweaking and debugging.
For blog writers and eCommerce stores regularly adding pages, Fetch as Google is one of the quickest and easiest ways of getting a page listed on Google. As part of Webmaster Tools, Fetch as Google allows you to view your website as the ‘Googlebot’ crawler sees it. Beyond these very useful features, Fetch as Google can also be used to analyse your site content. You can, metaphorically, borrow Google’s eyes, allowing for more advanced SEO, explained later in the blog.
Using ‘Fetch as Google’
To use the tool, log in to Google’s Webmaster Tools and select Crawl > Fetch as Google. Alternatively, visit ‘Google-bot Fetch’ directly and, if there is more than one website linked to the account, select the one you are looking to crawl. The tool itself is very easy to use.
To crawl the homepage, leave the text box blank. Alternatively, input the URL of the page you want to crawl. You are allowed to crawl up to 500 web pages per month. If you are just looking to index multiple pages, you can submit a whole subdirectory by selecting ‘Crawl this URL and its direct links’; this can be done 10 times a month.
You can then crawl the website to be indexed into Google’s listings by pressing Fetch. Alternatively, the drop-down box on the left allows you to render the site as the Googlebot would see it, giving you the page view that a desktop computer or mobile phone would use. This is useful for checking whether a page is responsive to different devices – very important for SEO. Just choose the device you want to view and press Fetch and Render.
This gives you two tabs. ‘Fetching‘ downloads your website’s code so you can debug it. ‘Rendering‘ gives a view of the site itself as the Googlebot would see it.
Fetch as Google Diagnostics
Fetch as Google also has diagnostics built in:
Complete Fetch
A complete fetch is successful. This means that all the content on the page is readable by Google.
Partial Fetch
A partial fetch means some content may not be readable by Google. If there is a major fault, you will be able to see it on the rendered image. If not, the tool will return a list of errors:
Redirect: this is an amber error. The ‘downloaded HTTP response’ in the Fetching tab can usually help you identify where the issue is.
If you fail to get a complete or partial fetch then there are issues on the page which could affect users. A redirect is usually bad for SEO, so you should either fix the problem on the page or stop the page being crawled with your robots.txt file. Google won’t list the page if it is a straight redirect. If you get a red error, you have a problem which will stop the page being added to Google’s index. These errors are often problematic for the users of your website.
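As a quick illustration, a robots.txt rule to stop Google crawling a problem page might look like this (the path /old-page/ is just a placeholder for your own URL; note that robots.txt strictly blocks crawling, which in practice keeps the page out of the listings):

```
User-agent: Googlebot
Disallow: /old-page/
```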
There are a fair few reasons Google might not be able to list your site – ranging from pages that require authorisation or a log-in, to incorrect URLs and missing pages. Google has a full diagnostic guide for these errors.
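As a rough sketch (this helper is my own illustration, not part of the tool), the status code at the top of the ‘downloaded HTTP response’ in the Fetching tab tells you which category of problem you are looking at:

```python
# Hypothetical helper: map the HTTP status code shown in the Fetching tab
# to the kind of Fetch as Google issue it usually suggests.
def classify_status(status_code: int) -> str:
    if 200 <= status_code < 300:
        return "ok"                   # page fetched successfully
    if 300 <= status_code < 400:
        return "redirect"             # amber error: the page redirects elsewhere
    if status_code in (401, 403):
        return "needs authorisation"  # Googlebot cannot pass a log-in or permission check
    if status_code == 404:
        return "missing page"         # incorrect URL or deleted page
    if status_code >= 500:
        return "server error"         # the site itself failed to respond properly
    return "other client error"

print(classify_status(301))  # → redirect
print(classify_status(404))  # → missing page
```

The thresholds follow the standard HTTP status-code classes, so the same logic applies whether you read the response in the tool or fetch the page yourself.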
Page Optimisation
So, going beyond the diagnostics, there is one more hugely useful feature in the rendering tab. Google often talks about the most important part of the page for the user experience being ‘above the fold‘. The fold is the boundary of the first screenful: anything a user would have to scroll down to reach is below the fold, and SEO effort is concentrated above it.
Google approves of placing as much information in this area as you can to keep your users from bouncing from the page. On the flip side, it is also vital to make the page load fast. Rendering allows you to see how quickly your pages are loading, and also what Google is finding above the fold. Keeping this in mind, you can design a page that will quickly engage the user:
- Keep your keywords and a variety of multimedia above the fold.
- If you have lots of images at the top of your page, consider using fewer, larger, better-compressed images, or a video.
- Ensure that your main page title appears above the fold as Google sees it.
- Using the Fetch as Google render tool, ensure that if you have a partial crawl the error falls below the fold.
Conclusion
If you regularly update your site, Google will crawl your pages on its own, but this takes time. If you submit a new sitemap to Webmaster Tools, Google will re-index your site and start listing your new content within a few days. While Fetch as Google doesn’t guarantee to list your content faster, it is a newer process than submitting a sitemap, and personal experience has found that it does seem to index pages quicker.
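For reference, a sitemap is just a small XML file listing your pages; a minimal one looks like this (the URL and date below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2015-01-01</lastmod>
  </url>
</urlset>
```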
If you are interested in more advanced aspects of SEO testing, the rendering tool is excellent for checking that your pages are as well optimised as possible.
Watch this space for a guide to submitting sitemaps to get crawled.
Good post – I’ve also had some success using Google’s page speed optimisation plugin; faster pages seem to lead to faster website indexing.
Thanks for the comment Adrian, this was one of my very early posts, still relevant. I am now more into search theory, although I am working on a PageSpeed module walkthrough because it’s very good and it wouldn’t surprise me if it contributes to ranking factor(s).
Your code is nice and neat.
Thanks, the fact that Google has a page speed tool within their Search Console points to it being important and likely to be more so in the future.
Also, if you search for the phrase crawl budget there seems to be a lot of discussion about Google allocating a fixed amount of crawler time to each website, so if pages are faster to download, more pages get crawled in the allocated time.
That would make for an interesting case study – make some very slow and some very fast sites with lots of pages and see what gets crawled faster.
Google is crawling very fast in general compared to 8 years ago, however, so count your blessings.
Thank you for your input. If you have any other ideas, let me know – I have quite a lot on, but I am always down for getting involved in case studies.
All the best
David