Temporarily disallow crawling?

I’ve set up a 503 with a Retry-After: 86400 on xxx.com and created a dev.xxx.com with a robots.txt of Disallow: /
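For what it’s worth, the blanket-disallow robots.txt on the dev subdomain only needs two lines (assuming you want to block every crawler, not just Googlebot):

```
User-agent: *
Disallow: /
```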

Will I have a problem with this? Will this affect future SEO? I don’t know much about SEO, and my past attempts at it have not gone well. :slight_smile: A couple ended up in irreversible situations.

### How can I temporarily suspend all crawling of my website?
You can temporarily suspend all crawling by returning an HTTP result code of 503 for all URLs, including the robots.txt file. The robots.txt file will be retried periodically until it can be accessed again. We do not recommend changing your robots.txt file to disallow crawling.

Edit: I’m really not sure where to put this topic.

How do you do that, return a 503?
Is that a specific URL you can put in?

I just used a PHP script that sends the headers. It was the easiest way since this is on my PHP box.

<?php
    // Headers must be sent before any output
    header('HTTP/1.1 503 Service Temporarily Unavailable');
    header('Status: 503 Service Temporarily Unavailable'); // for CGI/FastCGI setups
    header('Retry-After: 86400'); // ask crawlers to retry in 24 hours (seconds)
?>
<h1>503 - Service Temporarily Unavailable</h1>

You can do it with server configs on Nginx and Apache (rough sketch below), but since I have other things running on that box and nothing in the root, this was just the path of least resistance.
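If anyone wants to go the server-config route instead, here’s a rough, untested sketch of what the Nginx version might look like; the server name and maintenance page path are placeholders you’d adjust for your own setup:

```
server {
    listen 80;
    server_name xxx.com;  # placeholder

    # "always" makes the header go out on error responses too (nginx 1.7.5+)
    add_header Retry-After 86400 always;

    # Every request, including /robots.txt, gets a 503
    location / {
        return 503;
    }

    # Optional: serve a custom maintenance page as the 503 body
    error_page 503 /503.html;
    location = /503.html {
        root /var/www/maintenance;  # placeholder path
        internal;
    }
}
```

The point of keeping /robots.txt in the 503 along with everything else is that crawlers will just retry it later, which matches the recommendation quoted above.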

Thank you!