I have a number of PHP scripts that I wrote to compile information from across the web, with a separate script for each page I pull information from. Since I am trying to execute numerous scripts in a single cron task, it looks like the script is erroring out (and not telling me why).
This is my current script that I have running every night:
$file = @file_get_contents('http://site.mine.com/pages/param1');
$file = @file_get_contents('http://site.mine.com/pages/param2');
$file = @file_get_contents('http://site.mine.com/pages/param3');
Each of the three scripts is also reachable directly in a browser, but they typically take too long to execute or fail with a 500 error.
What is the correct way to mimic viewing a page through an automated task?
Since you're doing this in a cron job, I'd go for curl or wget. There's no need to fire up PHP for this.
Multiple runs execute independently of each other, so if one of them crashes, the others still go through.
/path/to/curl http://site.mine.com/pages/param1 >> command1
/path/to/curl http://site.mine.com/pages/param2 >> command2
/path/to/curl http://site.mine.com/pages/param3 >> command3
This appends the output of each of the "downloads" to the files command1, command2 and command3.
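To put that in a crontab, one entry per page is enough. This is a minimal sketch, assuming a nightly 02:00 schedule and placeholder log paths; `--fail`, `--silent` and `--max-time` are standard curl options that help with exactly the problems described (slow pages and 500 errors):

```shell
# Run every night at 02:00; adjust the schedule and paths to taste.
# --fail      exit non-zero on an HTTP error (e.g. 500) instead of saving the error page
# --silent    suppress the progress meter so cron mail stays readable
# --max-time  give up after 300 seconds so one hung page can't stall the job
0 2 * * * /usr/bin/curl --fail --silent --max-time 300 http://site.mine.com/pages/param1 >> /var/log/param1.log 2>&1
0 2 * * * /usr/bin/curl --fail --silent --max-time 300 http://site.mine.com/pages/param2 >> /var/log/param2.log 2>&1
0 2 * * * /usr/bin/curl --fail --silent --max-time 300 http://site.mine.com/pages/param3 >> /var/log/param3.log 2>&1
```

Because each line is its own cron entry, a failure in one curl run has no effect on the others.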