Getting pages corrected and reindexed in Google

We’ve all done it. You upload a new page or new content to your site and there is a mistake in it. The headline includes a terrible typo, or you missed a zero off the promotional price on the banner. In the past, all you could do was wait for Google to respider and then reindex the page. Depending on the site and how deep into it the page was, that could take some time. All very frustrating.

Google’s Webmaster Tools has now added a natty new feature that overcomes this. You’ll find it under ‘Diagnostics’ and then ‘Fetch as Googlebot’.

Fetch as Googlebot has been around for a while. It allows you to ask Googlebot to come and spider a page. Enter the URL, click ‘Fetch’, and the page will be listed. Click ‘Success’ once the page has been spidered and you will see the code that Google has read. The tool was originally intended to let webmasters check whether their sites had been hacked, but it is also useful because it shows the header status code (revealing any redirects and so on) and the download time, an increasingly important factor in determining rankings.
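If you want a quick local sanity check of the same things the tool reports, the status code, any redirect chain and the download time, a few lines of Python will do it. A rough sketch using the third-party requests library; the URL is just a placeholder:

```python
# Quick local check of what Fetch as Googlebot also reports: the header
# status code, any redirect chain, and how long the page takes to download.
# Uses the third-party 'requests' library; the URL below is a placeholder.
import requests

def check_page(url):
    response = requests.get(url, allow_redirects=True, timeout=10)

    # Any redirects Googlebot would have to follow on the way to the final page
    for hop in response.history:
        print(f"{hop.status_code} redirect: {hop.url}")

    # Final status code and the download time
    print(f"Final status: {response.status_code} for {response.url}")
    print(f"Download time: {response.elapsed.total_seconds():.2f}s")

if __name__ == "__main__":
    check_page("http://www.example.com/promo-page")
```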

The new feature is that you can now submit the page to Google, meaning that the new page or content jumps the queue for getting spidered.

You can go further and ask Google to respider and reindex all the pages that the corrected page links to.
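Fetch as Googlebot itself only works through the Webmaster Tools interface, so there is nothing to script there, but if you want to give Google a programmatic nudge that something on the site has changed, you can also ping it with your sitemap. A minimal sketch, assuming your sitemap lives at www.example.com/sitemap.xml (swap in your own):

```python
# Minimal sketch: ping Google with your sitemap URL so it knows the site has
# changed. This does not replace Fetch as Googlebot (which is UI-only); it
# simply asks Google to re-read the sitemap. The sitemap URL is a placeholder.
import urllib.parse
import urllib.request

def ping_google_sitemap(sitemap_url):
    ping_url = "http://www.google.com/ping?sitemap=" + urllib.parse.quote_plus(sitemap_url)
    with urllib.request.urlopen(ping_url, timeout=10) as response:
        print(f"Ping returned HTTP {response.status}")

if __name__ == "__main__":
    ping_google_sitemap("http://www.example.com/sitemap.xml")
```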

There is a limit on the use of Fetch as Googlebot. Matt Cutts, in his video on the feature, says you get 50 fetches per week and 10 linked-page submissions. But I have a WMT account with about 30 domains on it and it allows me 500 fetches, shared across all the domains of course. I am still limited to 10 linked-page submissions, though.

This tool has been around for about six months, but if you missed it, it’s very useful to know about for when the next drama happens.

Google’s Matt Cutts describes the process in more detail:

