I have two PHP applications that communicate with each other over a REST API, using cURL. Both applications are hosted on the same server.
My question is: doesn't using cURL add overhead when the communication stays on the same server? Would it be better to write a custom PHP function for this instead?
k thanx buhbye.
Couldn't disagree with you more there.
Using include allows execution of code that is hosted remotely. If the file you're including from another server gets compromised, then you can consider this server compromised too. You're also relying on allow_url_include being enabled, which you should leave off if you can avoid it.
A DNS attack could also cause include() to fetch a different remote file from the one you intended.
fsockopen or curl would be your best bet. I don't think there's much between the two performance wise.
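For reference, a typical same-server cURL call looks like the sketch below. The URL is a placeholder, not from the thread; the point is that even over loopback you still pay for TCP, HTTP parsing, and a second PHP request on the other end.

```php
<?php
// Minimal sketch of a loopback REST call with cURL.
// The endpoint URL is hypothetical; substitute your app's real route.
function fetchLocalApi($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 2);    // fail fast if the endpoint is down
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);           // cap total request time
    $body = curl_exec($ch);                         // false on failure
    curl_close($ch);
    return $body;
}
```

A call like `fetchLocalApi('http://127.0.0.1/api/users/1')` returns the response body, or `false` if the request fails.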
It's on the same server, i.e., file paths.
Besides, even if they were in different servers, you don't need URLs to share files between them.
Exactly - just copy the code you want to the second server in that event.
barry is right about remote file inclusion - very bad practice that. But that isn't what I was recommending. I was being a little snarky about it yesterday though so, meh.
Thing is, we are talking about two applications, likely referencing the same database. Writing a function to get at the shared data should be trivial. If the two are using two different frameworks, things get trickier. Since the OP is considering cURL for this problem, my gut tells me he's outsmarting himself and making things way more complicated than they need to be. So: Keep It Simple, Stupid. It will run faster that way.
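To make the "shared function" idea concrete, here's a minimal sketch: both apps call the same PHP code against the shared database, with no HTTP in between. The table, column, and DSN are hypothetical stand-ins for your real schema; the demo uses an in-memory SQLite database so it's self-contained.

```php
<?php
// A function both apps could share, querying the common database directly.
// Table/column names here are illustrative only.
function getUserName(PDO $db, $id)
{
    $stmt = $db->prepare('SELECT name FROM users WHERE id = ?');
    $stmt->execute([$id]);
    $name = $stmt->fetchColumn();      // false when no row matches
    return $name === false ? null : $name;
}

// Demo against an in-memory SQLite database (swap in your real DSN):
$db = new PDO('sqlite::memory:');
$db->exec('CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)');
$db->exec("INSERT INTO users (id, name) VALUES (1, 'alice')");
```

With that setup, `getUserName($db, 1)` returns `'alice'` and `getUserName($db, 2)` returns `null`, all in-process, with none of the serialize/HTTP/parse round trip a REST call would add.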
Yep. Totally ignore me. I missed the "same server" bit.
include() will do the trick
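To be clear, that means a same-server include by file path, not by URL. A small self-contained sketch (the helper file and function name are invented for illustration; in practice you'd include the other app's real library file):

```php
<?php
// Simulate the other application's library file, then include it by path.
// No HTTP round trip, no allow_url_include needed.
$helper = sys_get_temp_dir() . '/other_app_lib.php';
file_put_contents($helper, '<?php function greet($n) { return "hello, $n"; }');

include $helper;            // local filesystem path, not a URL
echo greet('world'), "\n";  // call the other app's function directly
```

This is the cheap path barry's warning doesn't apply to: the remote-inclusion risk only arises when the include target is a URL.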
If for any reason you do decide each application needs its own server, then my response starts to make a bit more sense...
This topic is now closed. New replies are no longer allowed.