Webmaster tools cannot fetch my index page

Hi,

My site has been up for over two years. However, when I try to fetch it in Google Webmaster Tools it fails each time. I have tried three times now. Can anyone give me any ideas as to why this may be, and whether it is something I should be concerned about?

thanks

Have you made any changes to the index page recently? If not, it could be an issue on Google's end; I would wait a couple of days and see if the problem persists.

Does your website need to be re-validated with Google?

I have tried to fetch it three times over a period of three months.

Maybe. How do I know if it does, and how do I re-validate?

Webmaster Tools should tell you if you need to re-validate your site.

The most common culprit for Google not being able to access a page is a misconfigured robots.txt file.
What site are you having trouble with?

Hi, there is nothing there to suggest it needs re-validating.

The site I am trying to fetch is http://www.web-writer-articles.co.uk.

The robots.txt file says (I believe):

User-agent: *
Disallow: /insert.php

At the bottom of the page in webmaster tools it says:

URLs Specify the URLs and user-agents to test against.
http://www.web-writer-articles.co.uk
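For what it's worth, you can sanity-check those rules yourself rather than relying on Webmaster Tools. Here's a small sketch using Python's built-in robots.txt parser, fed the two lines quoted above (the URLs are just the ones from this thread):

```python
from urllib.robotparser import RobotFileParser

# The robots.txt rules quoted above, fed to Python's built-in parser
rules = [
    "User-agent: *",
    "Disallow: /insert.php",
]

rp = RobotFileParser()
rp.parse(rules)

# Googlebot should be allowed to fetch the index page...
print(rp.can_fetch("Googlebot", "http://www.web-writer-articles.co.uk/"))
# ...but not the disallowed script
print(rp.can_fetch("Googlebot", "http://www.web-writer-articles.co.uk/insert.php"))
```

If the first call prints True, the rules as quoted are not what's blocking Google, and the failure is happening somewhere else (server, DNS, or Google itself).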

Huh… What exact error message does Google give you?

The error occurs when I am trying to fetch the index page as Google in Webmaster Tools. Usually when it has fetched the page you see a green tick and the word "Success". When it tries now, it goes on forever and then eventually there is a red cross and the word "Failed".

I’ve actually had this problem pop up on numerous sites in the past four weeks in Webmaster Tools, and it seems like a lot of other people are having the same problem. This morning I had a site with a WMT warning of “Severe health issues are found on your site. - Check site health” even though its robots.txt contained about what you have going on (it’s a WP setup, so there was a disallow on /wp-admin and such).

I grabbed an effectively non-functional (fully commented-out) robots.txt file from another site with no such warnings and it fixed the problem… I think this is a problem Google has right now, not necessarily your site. Here’s the commented-out file I replaced it with:

# See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file
#
# To ban all spiders from the entire site uncomment the next two lines:
# User-Agent: *
# Disallow: /

That said, what specifically are you trying to do by fetching as Google? It seems like your site is indexed, so it doesn’t look like Google is having problems finding you.

I took the robots.txt file down. Then I tried to fetch again. Again it failed.

The reason I am using fetch is that I recently updated the home page, so for speed I wanted to re-submit it.

I think you should have another look at your site: check your robots.txt file and your meta tags, and look for the index and follow directives. Then try requesting that Google index the page again. If the problem persists, try contacting Google or browsing their support forum.

Hi,

Just to let people know that I have now successfully submitted the index page. It happened after I took down the robots.txt file. Why this should be, I don't know, but it may help others who come across the same problem.

regards and thanks for the ideas,