REST API for internal usage

I am currently building a REST API in PHP, for external access to local data.

This gave me the idea to also use the same REST API for all our local applications' data retrieval. That way we could maintain all database and business logic in the API core, while the other applications focus mostly on presentation. Or do you think this is bad application design that perhaps creates too much overhead (using curl requests instead of direct database access)?
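Concretely, every internal read would then look something like this (the host, route, and payload shape are made up for the sketch):

```php
<?php
// Hypothetical internal endpoint; host and route are assumptions.
$ch = curl_init('http://api.internal/customers/42');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return the body instead of printing it
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 2);     // fail fast on the LAN
$body = curl_exec($ch);                          // TCP connect + HTTP round trip
curl_close($ch);

// Every read also pays for JSON encoding on the server and decoding here:
$customer = ($body !== false) ? json_decode($body, true) : null;
```

That per-call cost (TCP connect, HTTP parsing, JSON encode/decode) is what would be weighed against a direct query.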

Thanks.

Too much overhead; do not use the network for internal communications. Instead, use more readily available means of communication between your processes or what have you; this depends on the system it's running on, of course. Now, you can mimic REST if you like, but do not use HTTP or the network for internal stuff. That's like throwing a whale into a mini toilet.

Thanks for the reply. I follow your argument, and I think that you are right.

But there are more and more external applications, such as smartphones apps, that require access to centralized data. How do I best wrap database and business logic, so that it is common to both external and internal applications?

I disagree with the "too much overhead" reply. While a REST-like API certainly costs more processing power, it provides a clear interface to the application. What happens otherwise is that multiple applications end up using the base system's SQL as their API, and who, what, where, and why another application is calling the base system becomes almost impossible to answer. Limiting the interface to some simple REST-like functionality limits the scope, which makes future refactoring far easier.
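To illustrate the limited-interface point, here is a hedged sketch; the interface and method names are invented, and the transport behind it could be HTTP or anything else:

```php
<?php
// Hypothetical narrow API surface; names are illustrative only.
// Applications code against this, never against the SQL schema,
// so the schema can be refactored as long as these signatures hold.
interface CustomerApi
{
    /** @return array<string,mixed>|null the customer record, or null if absent */
    public function get(int $id): ?array;

    /** @return int the new record's id */
    public function create(array $fields): int;
}
```

The refactoring win is exactly this: the schema behind the interface can change freely, because every caller is confined to these few methods.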

Besides that, an HTTP request from an internal application does not need all the network overhead associated with an external HTTP request; I believe pointing the internal hosts files directly at the server in question would address this issue. As for the performance of an HTTP request in general: yes, it is one more layer of abstraction on top of an already existing SQL interface. However, you could set up lighttpd to handle the REST requests and keep things lightweight and minimal, almost on par with a direct SQL query.
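For what it's worth, the hosts-file shortcut would just be an entry like this on each internal machine (the hostname is a placeholder); it saves the DNS lookup, though not the TCP or HTTP cost:

```
# /etc/hosts on each internal machine (C:\Windows\System32\drivers\etc\hosts on Windows)
# Resolve the API hostname locally instead of via DNS:
10.0.0.5    api.internal
```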

Cheers,
Alex

Alex, why would you bring HTTP into your application when you can issue the same calls through faster means, like console input/output streams or any other inter-process communication? Why take on the overhead of an HTTP server when you can have REST-API-like interaction without HTTP or any network usage?

You say using REST will limit scope and make future refactoring easier, but then you turn around and make the system more complex by putting in an HTTP server to handle requests. That does not make things easier in the future. It's one more vector that needs to be tested, one more point of failure, and one more area of unnecessary overhead.

But like I said: use a REST-like API for internal operations, but leave the network and HTTP out of the equation. You do not need them, and you avoid the overhead incurred when PHP makes network requests (which are god-awful slow).
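In other words, a rough sketch with invented names: keep the REST vocabulary of resources and verbs, but dispatch it as a plain method call in the same PHP process, with no sockets and no HTTP parsing:

```php
<?php
// Hypothetical in-process dispatcher; class, routes, and data are invented.
class ApiCore
{
    /** @var array<int,array<string,mixed>> fake data store for the sketch */
    private array $customers = [42 => ['id' => 42, 'name' => 'Ada']];

    /** Dispatch a REST-like (method, path) pair straight to business logic. */
    public function handle(string $method, string $path): ?array
    {
        if ($method === 'GET' && preg_match('#^/customers/(\d+)$#', $path, $m)) {
            return $this->customers[(int) $m[1]] ?? null;
        }
        return null; // unknown route
    }
}

// Internal callers use the same vocabulary, minus the network:
$api = new ApiCore();
$customer = $api->handle('GET', '/customers/42');
```

An HTTP front end for external clients can then be a thin adapter that translates requests into the same `handle()` calls.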

This may or may not be of any use whatsoever, but you could implement a REST API that is then accessed via the CLI for local applications.

Take CodeIgniter for example: Running via the CLI : CodeIgniter User Guide
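Something like this, as a sketch: a thin CLI front controller that takes the method and path from argv instead of from HTTP. The routes are stubbed here; a real version would reuse the API core's router, as CodeIgniter's CLI mode does.

```php
<?php
// cli.php - hypothetical CLI front end for the API core.
// Usage: php cli.php GET /customers/42
function dispatch(string $method, string $path): array
{
    // Stubbed route table standing in for the real business logic:
    $routes = [
        'GET /customers/42' => ['id' => 42, 'name' => 'Ada'],
    ];
    return $routes["$method $path"] ?? ['error' => 'not found'];
}

$method = $argv[1] ?? 'GET';
$path   = $argv[2] ?? '/customers/42';
echo json_encode(dispatch($method, $path)), PHP_EOL;
```

Local applications then shell out to (or exec) this script and read JSON from stdout, so the same core serves HTTP externally and the CLI internally.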

That sounds like a great idea!