So, having gone through all the links shown in GWT and removed those I can that I don't believe are beneficial, I go back to GWT and they are still there. Weeks and weeks go by, and they won't go away. My understanding is that unless the sites the links are coming from are updated, they won't get re-crawled. I haven't had an unnatural links warning, so I can't notify Google via a reconsideration request that these links are gone.
Google encourages webmasters to look at where their links are coming from, but there doesn't seem to be any machinery in place to let Google know about the work that has been done. Unless I am missing something?
I don't think that can be correct. How will Google know whether the sites have been updated unless it crawls them to find out? :)
My understanding is that they can check the last-modified date; is that not correct? If there has been no change, there is no need to crawl. Either way, the pages cannot have been re-crawled, or the now-removed links would not still be showing?
I don't know whether Googlebot checks the last-modified date. But if the links have been removed, surely the last-modified date will have changed, so it will do the re-crawl in any case.
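To illustrate the logic being discussed: a minimal sketch of how a date-based crawler might decide whether to re-fetch a page, comparing the Last-Modified date it has on record with the one the server now reports. This is purely hypothetical; Googlebot's actual crawl policy is not public, and the function name and dates are made up for the example.

```python
from email.utils import parsedate_to_datetime

def should_recrawl(stored_last_modified: str, server_last_modified: str) -> bool:
    """Hypothetical policy: re-crawl only if the server reports a newer
    Last-Modified date than the one stored from the previous fetch."""
    stored = parsedate_to_datetime(stored_last_modified)
    current = parsedate_to_datetime(server_last_modified)
    return current > stored

# A page edited to remove links gets a new Last-Modified date,
# so even a date-based crawler would still pick up the change.
print(should_recrawl("Mon, 01 Oct 2012 10:00:00 GMT",
                     "Mon, 15 Oct 2012 10:00:00 GMT"))  # True: date changed

print(should_recrawl("Mon, 01 Oct 2012 10:00:00 GMT",
                     "Mon, 01 Oct 2012 10:00:00 GMT"))  # False: unchanged
```

The point is that checking the last-modified date and noticing removed links are not in conflict: removing a link updates the date, which under this scheme triggers the re-crawl anyway.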
You say the links are still there after "weeks and weeks". How many weeks exactly? It can take quite a long time for Google's index to be updated.
Also, are you sure the links have in fact been removed? Is it possible that they have been re-posted?
This is not by any means something that is unique to my site. But unless someone has been actively trying to remove links, there is no real reason why they should even be aware of this problem. In my case the links were removed up to 6 weeks ago, but others have problems with links that have been gone for much longer (from discussions on other forums). Those of mine have not been re-posted.
It is the very fact of the time it takes for Google's index to be updated that I am querying. Ignoring for the moment the possible reasons why a site hasn't been re-crawled, what I am really looking for is a way to prompt such action, or to notify Google that the links are gone.
Why don't you try re-submitting your URL in GWT so that Google re-crawls and re-indexes the pages? I think that would do it. And yes, as for the sites your links are coming from not being updated, the obvious reason could simply be that they haven't been re-crawled yet.
I had a site hacked a while back and a large number of URLs added to it. I removed the added files and submitted a reconsideration request to Google. They responded quite quickly, agreeing that there was no longer a problem with the site - but it took them months to stop reporting the removed URLs as 404 errors. It drove me demented. They knew those pages should never have been there, they'd acknowledged that I'd removed them and that this was A Good Thing - then "complained" that they couldn't find them. :rolleyes: My point being that, in my experience, Webmaster Tools is not always a true reflection of the situation.
My site has not been de-indexed, and has not had a manual penalty, therefore there is nothing for Google to reconsider.
An interesting story, which does highlight the problem. But unless they are using a different database from the one shown in Webmaster Tools, couldn't SERPs be influenced either way by outdated data?