nayen — 2012-11-07T08:08:24-05:00 — #1
I have done some research and found a way to prevent access to included files using the below code:
if (count(get_included_files()) == 1) exit('Access denied.');
if (count(get_included_files()) == 1) header('Location: /* home page or login page */');
Is this a good or secure way to prevent direct access to files that are not meant to be accessed directly, such as PHP include files? Do you have any better alternative methods?
cups — 2012-11-07T09:46:24-05:00 — #2
You don't stipulate whether you do this or not, but keep them out of (above) your document root.
Name them .php files (not .inc files, like we used to see in the past) so they get run through the parser in case you become susceptible to some directory traversal attack.
I remember using a similar system. From memory, the first line was roughly:
if (basename($_SERVER['SCRIPT_NAME']) === basename(__FILE__)) die();
When I started using OOP it became a pain to start every class file off with that line, and since I never (touches wood) suffered a directory traversal attack, I dropped the idea.
nayen — 2012-11-07T11:18:49-05:00 — #3
Thanks for the suggestion. What I am working on is a simple application that does a certain task with some sort of admin panel which requires login. This will be distributed. So, I can't ask everyone to put those folders above the root. That's why I am trying to find a solution within the system itself. You can think of it like a CMS, but a very, very simple one.
lemon_juice — 2012-11-07T13:37:38-05:00 — #4
I have seen various libraries use checking for a constant that is defined in some common script and then use this at the beginning of every file:
if (!defined('MY_CMS_NAME')) exit('Access denied.');
I like this method because it also disallows including the files from scripts outside the library they are part of, increasing security by a tiny bit.
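In case it helps anyone reading along, here is roughly what that looks like with both halves shown together in one sketch (the constant name is just an example; in a real app the define() lives in the entry point and the guard sits at the top of every included file):

```php
<?php
// Entry point (e.g. index.php) -- defines the constant before
// including any internal files.
define('MY_CMS_NAME', true);

// Top of every internal file (e.g. includes/helpers.php) -- a direct
// HTTP request to that file never gets past this guard, because the
// constant is only defined when the entry point ran first.
if (!defined('MY_CMS_NAME')) {
    exit('Access denied.');
}

echo "guard passed\n";
```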
cups — 2012-11-07T14:32:51-05:00 — #5
That's funny, me too.
I'm just trying to work out how to have the core admin files (there are only 4, in a folder called /edit) live in a central place so that server X can host any number of websites while sharing a single set of edit files.
I may post my problem on the SP server setup forum if I can't find a ready solution. It's going to be something to do with a rewrite rule somewhere, but I'm getting dizzy trying to work out whether that should go in the httpd.conf file or elsewhere ...
felgall — 2012-11-07T16:26:17-05:00 — #6
As the intention of the code is to stop the file being directly accessed you'd just need that one statement at the top of each file. The OOP code would come after it and would not need to test for that individually.
kduv — 2012-11-07T16:44:34-05:00 — #7
You could always just add a list of allowed IPs to the PHP file(s) and check it against the $_SERVER['REMOTE_ADDR'] variable.
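A minimal sketch of that idea (the addresses here are placeholders, taken from the documentation ranges):

```php
<?php
// IP allowlist check: only requests from a known address get through.
function ip_is_allowed($clientIp, array $allowedIps)
{
    // Strict comparison so '' or null never matches an entry.
    return in_array($clientIp, $allowedIps, true);
}

// Usage at the top of a protected script:
// if (!ip_is_allowed($_SERVER['REMOTE_ADDR'], array('127.0.0.1', '192.0.2.10'))) {
//     exit('Access denied.');
// }
```

Bear in mind REMOTE_ADDR can be a proxy's address rather than the visitor's, so this works best on networks you control.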
mildfoam — 2012-11-07T18:29:12-05:00 — #8
This code isn't mine, but I hope it works for you.
Define a constant in every script that people are allowed to access directly.
Then, in any script that you don't want people to access directly, check for that constant and bail out with:
echo 'You cannot access this file directly.';
kduv — 2012-11-07T19:34:44-05:00 — #9
That's the same thing Lemon Juice said.
cheesedude — 2012-11-08T02:23:38-05:00 — #10
The idea of declaring a constant at your entry point, testing for it, and dying if it hasn't been set was mentioned previously. This is the method used by many scripts. It is effective, even if checking for a defined constant at the top of every PHP file is a pain.
Have you considered putting the included files in their own folder (below public_html) and then putting an htaccess file in there with a "deny from all" statement to deny direct access to the included files? Then your scripts could access the files but direct access would be blocked by the server (if you are using Apache). Normally, I try to keep all my important scripts above public_html. But for any includes below public_html, I check for a defined constant and block access with htaccess.
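For reference, the htaccess file cheesedude describes can be as small as this (Apache 2.2 syntax; on Apache 2.4 the equivalent is "Require all denied"):

```apache
# includes/.htaccess -- block all direct HTTP access to this folder.
# PHP scripts can still include/require the files via the filesystem.
Order allow,deny
Deny from all
```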
pompopom — 2012-11-08T03:05:02-05:00 — #11
and what about the good old .htaccess file?
nayen — 2012-11-08T03:57:00-05:00 — #12
Thanks for the pre-defined constant tip, but why would you want to display an "Access denied" message if that is a critical file? Wouldn't it be better to redirect the visitor (or the bad guy who is looking for a gap in your system) to the home page or login page?
By the way, I got that idea I used in my OP from the following page:
Some comments say that it is superior to checking a constant, that's why I wanted to ask here for your opinions too.
I actually don't want to use htaccess if I don't have to, because I don't want to assume that all users will be on Apache servers.
pompopom — 2012-11-08T04:26:31-05:00 — #13
What makes you think .htaccess only works on Apache?
It's true that Apache allows a lot more nifty features, but blocking access is a standard .htaccess thing and AFAIK works on IIS also.
nayen — 2012-11-08T04:52:16-05:00 — #14
Never heard that, do you have a source to confirm this?
lemon_juice — 2012-11-08T05:47:50-05:00 — #15
This is just an example; do whatever you want when the check fails. I think the best idea would be to display the standard "404 page not found" page that you have for non-existent pages, without a redirect (you would then do an internal redirect with include "page404.html"; ). This way an attacker would not be able to tell whether they targeted a real file or not. But don't forget to set the 404 response code in that case.
I wouldn't say it's superior; it's just more convenient because you have all the code in one place without needing to define anything. However, I like the constant better because it not only checks that the file is included but also that it was included from the proper context, that is, after initializing the library/framework in question.
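Spelling out that 404 idea as code, it might look like this (page404.html is a placeholder name, and the first define() only simulates the entry point having run so the sketch is self-contained):

```php
<?php
// Simulates the entry point; remove this line in the real included file.
define('MY_CMS_NAME', true);

// Top of a protected include: answer a blocked direct request exactly
// like a missing page, so an attacker can't tell real file names
// from guesses.
if (!defined('MY_CMS_NAME')) {
    header('HTTP/1.0 404 Not Found'); // set the real 404 status code
    include 'page404.html';           // internal redirect, no Location header
    exit;
}

echo "included normally\n";
```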
nayen — 2012-11-08T06:03:49-05:00 — #16
Thank you very much for the clarification.
tomb — 2012-11-08T07:06:35-05:00 — #17
Let's take a step back here and look at the bigger picture. How much of an issue is this, really? These somewhat convoluted solutions are all well and good, but the simplest method is just using properly structured code in the first place.
If all your code is inside classes, or even just functions, then even if your first line of defence (keeping the files outside the web directory) fails and the script is executed, it cannot do anything detrimental to your server. Nothing actually gets processed other than a class or function being defined.
The whole idea of adding lines of code to each file is ridiculous because even if the file is run in isolation it should not be possible to have it cause any problems on the server or reveal anything useful to an attacker.
Beyond that, it limits the script's reusability and makes testing difficult, as does moving it to a different project - your new project needs to remember to define that constant. What if the files for that project define their own constant? Messy. The crux of the issue is poor separation of concerns: the script itself should not be governing its own access rights.
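TomB's point is easy to demonstrate: a file that contains nothing but a definition does no work and produces no output when requested directly, because PHP only registers the class.

```php
<?php
// A file containing only a class definition is harmless to execute
// directly: no statements run at the top level, so a direct request
// returns an empty response.
class Greeter
{
    public function greet($name)
    {
        return 'Hello, ' . $name;
    }
}

// The class only does anything when another script instantiates it:
// $g = new Greeter(); echo $g->greet('world');
```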
lemon_juice — 2012-11-08T09:39:25-05:00 — #18
TomB, you may have a good point here. I've never used these techniques myself, but I thought they might be useful for distributed libraries. You are right, though - there is no real threat in leaving PHP files that contain only classes as they are. Someone might want to use this to stop others guessing file names by entering them in a web browser, but that's something I'd describe as security through obscurity.
nayen — 2012-11-08T09:55:22-05:00 — #19
Thanks a lot for this additional info. Actually, I haven't been using classes; perhaps I should revise my scripts and move them into classes as soon as possible.