Is my PHP background task / Cron idea a good way to do this?

I want to run certain tasks in the background rather than on user request (e.g. send a transactional email and then, based on the response, take another action).

If I had a script which added this info to a database:

  • PHP class
  • PHP method
  • parameters to pass to that method
  • etc.

…and then every 5 minutes one Cron Task retrieved all the data from this database and executed those tasks (using class, method and params), is this secure?
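In rough terms, I'm imagining the cron script doing something like this (just a sketch; the `tasks` table and column names here are hypothetical):

    // Sketch only: assumes a `tasks` table with `class`, `method` and `params` (JSON) columns
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
    foreach ($pdo->query('SELECT id, class, method, params FROM tasks') as $task) {
        $class  = $task['class'];
        $object = new $class();
        // call the stored method with the stored parameters
        call_user_func_array(
            array($object, $task['method']),
            json_decode($task['params'], true)
        );
        // remove the task once it has run
        $pdo->prepare('DELETE FROM tasks WHERE id = ?')->execute(array($task['id']));
    }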

It sounds like a great idea to me. I could just store the details of any class and method (and params) and have it run later.

If there were only one or two tasks then I suppose I could just write a specific script and execute it as a Cron Task, but this approach gives me the flexibility to run anything.

It sounds to me like what you are trying to accomplish is a Message Queue system. Something along the lines of RabbitMQ or Amazon SQS.

I’ve had a similar task to solve at my company. We allow users to, among other things, generate PDFs on demand to track their progress through our course. When a user requests a PDF, the web server sends a message to Amazon SQS. Then, on a “backend” server, I have an Upstart job that monitors the Amazon SQS queue. When it sees a new message, it reads it and does whatever the message asks.

My job that monitors the Message Queue uses what I suppose you would call the Command Pattern. Each message is a JSON-encoded array like this:


    $message = array(
        "command" => "PDFCompletionCertificate",
        "args" => array(
            "name1" => "value1",
            "name2" => "value2"
        )
    );
    $message = json_encode($message);
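Publishing that message is then only a few lines with the AWS SDK for PHP (a sketch; the region and queue URL are placeholders for your own values):

    use Aws\Sqs\SqsClient;

    // Sketch: region and queue URL are placeholders
    $queueUrl = 'https://sqs.us-east-1.amazonaws.com/123456789012/jobs';

    $sqs = new SqsClient(array(
        'region'  => 'us-east-1',
        'version' => '2012-11-05',
    ));

    $sqs->sendMessage(array(
        'QueueUrl'    => $queueUrl,
        'MessageBody' => $message, // the JSON string built above
    ));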

The queue monitoring script reads the message, decodes it, and, in this case, instantiates a new PDFCompletionCertificateCommand object, calls its PDFCompletionCertificateCommand::setArgs() method with the args from the message, and then calls PDFCompletionCertificateCommand::execute() to run the command.

Each command is its own separate class that implements a common interface so that the queue monitoring script can reliably run it. Then, adding a new job is simply a matter of creating a new class that implements this interface.
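The interface itself can be tiny. Something along these lines (a sketch; the interface name is illustrative, only setArgs() and execute() are the actual contract):

    interface CommandInterface
    {
        public function setArgs(array $args);
        public function execute();
    }

    class PDFCompletionCertificateCommand implements CommandInterface
    {
        private $args = array();

        public function setArgs(array $args)
        {
            $this->args = $args;
        }

        public function execute()
        {
            // build the PDF using $this->args['name1'], $this->args['name2'], ...
        }
    }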

The monitoring job, like I said, is an Upstart job, so it runs as a daemon all the time: it checks the queue, sleeps for a few seconds, and then checks again. This way my jobs run almost as soon as they arrive.
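Stripped down, the monitoring loop looks something like this (a sketch; error handling and SQS long polling are left out):

    // Sketch: $sqs and $queueUrl configured as in the sending example
    while (true) {
        $result = $sqs->receiveMessage(array('QueueUrl' => $queueUrl));

        foreach ((array) $result->get('Messages') as $msg) {
            $data  = json_decode($msg['Body'], true);
            $class = $data['command'] . 'Command'; // e.g. PDFCompletionCertificateCommand

            $command = new $class();
            $command->setArgs($data['args']);
            $command->execute();

            // delete the message only after the command has run
            $sqs->deleteMessage(array(
                'QueueUrl'      => $queueUrl,
                'ReceiptHandle' => $msg['ReceiptHandle'],
            ));
        }

        sleep(5); // then check again
    }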