Googlebot cannot access my site

My robots.txt file is correct, but Googlebot cannot access my site.
Google has shown a server connectivity error saying the server timed out or the site is blocking Google.
Google has also shown a DNS error message, but my hosting company has said there is no problem with the DNS.
I am looking for a good solution.

Thanks

Your robots.txt file reads:

User-agent: *
Disallow:

I don’t believe that is what’s blocking Google. An empty “Disallow:” line actually allows all robots to crawl everything; it would need a / after it (“Disallow: /”) to block crawling of your whole site.
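
If you want to confirm how a standard parser reads those rules, here is a minimal sketch using Python’s built-in urllib.robotparser module; the example.com URLs are just placeholders for your own domain:

from urllib import robotparser

# Feed in the exact rules quoted above
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow:",
])

# An empty Disallow means "allow everything", so both checks print True
print(rp.can_fetch("Googlebot", "http://www.example.com/"))
print(rp.can_fetch("Googlebot", "http://www.example.com/any-page.html"))

Changing the second rule to “Disallow: /” makes both checks print False, which is the form that would actually lock crawlers out.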

What was the purpose of putting this code in your robots.txt file? What were you hoping to achieve? As it stands it won’t stop search engines from crawling you, and badly behaved spam bots don’t care at all what you put in a robots.txt file anyway, so the crawl errors are more likely down to the server connectivity and DNS problems Google reported.
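
As a rough first check on those messages, you could run something like this from a machine outside your host’s network. Again, example.com stands in for your own domain, and a firewall may treat real Googlebot requests differently, so this only rules out the most basic problems:

import socket
import urllib.request

host = "www.example.com"  # placeholder for your domain

# 1. Does the name resolve from here?
addresses = socket.getaddrinfo(host, 80)
print("DNS resolves to:", sorted({a[4][0] for a in addresses}))

# 2. Does the server answer a request that identifies itself as Googlebot?
req = urllib.request.Request(
    "http://" + host + "/",
    headers={"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"},
)
with urllib.request.urlopen(req, timeout=10) as resp:
    print("HTTP status:", resp.status)

If the lookup fails or the request times out, that points back at the server or DNS configuration rather than at robots.txt.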