Process curl requests in a queue

Let me explain the problem:

I have a site with multiple buttons, and on each button click I have to pass some information to another server. Please keep in mind that I won't be able to use jQuery here. My approach is below:

  • On the click of each button I will call a JS function which will trigger an ajax call to an internal PHP page. From that PHP page I will invoke curl to send the data to the other server (a rough sketch of that PHP page is below).

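Roughly, the internal PHP page might look something like this - the remote URL is just a placeholder:

```php
<?php
// Internal page called by the ajax request from the button click.
// The remote URL below is a placeholder for the real second server.
$remoteUrl = 'https://example.com/receive.php';

$ch = curl_init($remoteUrl);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($_POST)); // forward whatever the button sent
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 2); // fail fast if the remote server is unreachable
curl_setopt($ch, CURLOPT_TIMEOUT, 5);        // don't let a slow remote server hold the request open

$response = curl_exec($ch);

if ($response === false) {
    http_response_code(502);
    echo 'Error: ' . curl_error($ch);
} else {
    echo 'OK';
}

curl_close($ch);
```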
Now, at any given time there can be more than 1000 users clicking different buttons at the same time. I have two questions:

  1. Can curl process hundreds of these requests without any problem? Do I need to make any special changes to the curl requests?

  2. Considering the very large number of requests, is there any other approach that I can follow?

Thanks in advance.
Nilanjan

I would set up a VPN between the two servers and work out a way to post the data to the 2nd server without curl, because I can imagine that many requests will lag your server.

Regarding another approach to follow: is there a need to have the button presses recorded by the second server instantaneously? You could stack the records up and send them across in batches every X minutes, for example.

Thanks StarLion. But how do I stack the records and send them across in batches using PHP?

Make your ajax-invoked PHP scripts write the data to a file or a database, and then set up a cron job (every x minutes) to invoke another PHP script that will read your saved data and send it to the remote server in one go.
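For example, the cron-invoked script could look something like this (assuming the ajax scripts append one json-encoded record per line to a file; the file path and remote URL are just placeholders):

```php
<?php
// send_batch.php - run from cron every few minutes.
// Assumes the ajax-invoked scripts append one json-encoded record per line
// to the pending file. The path and URL are placeholders.
$pending   = '/path/to/pending.log';
$remoteUrl = 'https://example.com/receive_batch.php';

if (!file_exists($pending) || filesize($pending) === 0) {
    exit; // nothing to send
}

// Move the file aside first, so new records keep accumulating in a fresh file
// while this batch is being sent.
$batchFile = $pending . '.' . time();
rename($pending, $batchFile);

$records = array();
foreach (file($batchFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
    $records[] = json_decode($line, true);
}

$ch = curl_init($remoteUrl);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode($records));
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: application/json'));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 30);

if (curl_exec($ch) !== false) {
    unlink($batchFile); // batch delivered, discard it
} else {
    // Put the records back so the next cron run retries them.
    file_put_contents($pending, file_get_contents($batchFile), FILE_APPEND | LOCK_EX);
    unlink($batchFile);
    error_log('Batch send failed: ' . curl_error($ch));
}
curl_close($ch);
```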

In that case it will also increase the load on the DB, due to so much interaction.

Well, if you want to store that much data from frequent user interactions then you will increase the load on something anyway. An insert into the DB will almost always be faster than sending the data by curl to a remote server - at least you can be sure (if all is working properly) that your script will finish fast. If there is network congestion, your simultaneous curl requests can stack up, resulting in many concurrent PHP scripts staying active; that puts you at risk of exceeding server limits and, in the worst case, even bringing your site down when you reach the maximum allowed number of concurrent PHP requests.

With the DB you have some options for tuning it to minimize the overhead of your inserts (like using MyISAM, tuning InnoDB for faster but less safe inserts, or using the MEMORY engine). Or simply save your data to a file in any format you choose (for example, each record being a separate line of json-encoded data) - this should perform very well.
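For the file option, the ajax-invoked script could append each record with an exclusive lock so that concurrent requests don't interleave their writes - something like this (the field names and path are just placeholders):

```php
<?php
// In the ajax-invoked script: append one json-encoded line per button click.
// Field names and the file path are placeholders.
$record = array(
    'button'  => isset($_POST['button']) ? $_POST['button'] : null,
    'user_id' => isset($_POST['user_id']) ? $_POST['user_id'] : null,
    'time'    => time(),
);

// LOCK_EX keeps simultaneous requests from interleaving their writes.
file_put_contents('/path/to/pending.log', json_encode($record) . "\n", FILE_APPEND | LOCK_EX);
echo 'OK';
```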

With these methods you will not experience any problems if the remote server temporarily doesn't respond for any reason - the cron job will simply try sending the data again a few minutes later, and that's it.