403 crawl error in Google WMT

Anyone know why this might be happening and how to fix it?

Client site is: terrilonglandscape.com

Google Webmaster Tools is showing a “403 crawl error” (access denied) on the blog page: http://www.terrilonglandscape.com/blog/

“Googlebot couldn’t crawl your URL because your server either requires login to access the page, or is blocking Googlebot from accessing your site.”

Not sure why, since the page doesn’t require a login to view. I’m just taking this site over for a client. I also don’t know why the robots.txt file has the lines below, but I don’t think they’re what’s blocking the blog:

sitemap: http://cdn.attracta.com/sitemap/503698.xml.gz

sitemap: http://cdn.attracta.com/sitemap/503696.xml.gz
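(Attracta looks like a third-party sitemap service, so those lines just point crawlers at sitemaps.) As far as I understand it, robots.txt only blocks crawling through a Disallow rule, something like:

User-agent: *
Disallow: /blog/

and there’s nothing like that in the file. A robots.txt block would also show up in WMT as “blocked by robots.txt” rather than a 403, if I’ve got that right.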

Any ideas on how to fix this? I posted in the Google WMT forum but got nothing.

Thanks in advance!

You have this meta tag at the top of your page, which sends Google away:

<meta name="robots" content="[COLOR="#FF0000"]noindex[/COLOR], nofollow, [COLOR="#FF0000"]noarchive[/COLOR], noodp, noydir" />

Wow, thanks Ralph! I didn’t even think to check that. Weird that they’re there: the client uses the Thesis theme, and all three of those directives are unchecked in its settings. I have no idea how they’re getting in there.

Ralph, I see this meta tag:
<meta name="robots" content="noodp, noydir" />

But not the one you are seeing:
<meta name="robots" content="noindex, nofollow, noarchive, noodp, noydir" />

Hm, weird. I just copied that from one of the pages, but I don’t see it now. Sorry for the confusion. Is there any chance those options were only changed recently? Just a thought.

Hmmm… I’m not sure. That’s weird. Do you know what else it could be?

Check whether you have a plugin installed that denies crawlers access (403 = Forbidden).
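One quick way to test that from the outside is to request the page with a normal browser User-Agent and again with Googlebot’s, then compare the status codes. Here’s a rough sketch, assuming Python 3 is handy (note it only catches User-Agent-based blocking; a plugin that blocks by IP range won’t show up this way):

import urllib.request
import urllib.error

URL = "http://www.terrilonglandscape.com/blog/"

def status_for(user_agent):
    # Return the HTTP status code the server sends for this User-Agent.
    req = urllib.request.Request(URL, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

print("browser:  ", status_for("Mozilla/5.0 (Windows NT 6.1)"))
print("googlebot:", status_for(
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"))

If the Googlebot request comes back 403 while the browser one gets 200, something on the server (a security plugin or an .htaccess rule) is filtering on the User-Agent, and that’s where to look.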