Preventing direct access to included files

Hi,

I have done some research and found a way to prevent direct access to included files using the code below:

if (count(get_included_files()) == 1) exit('Access denied.');

or

if (count(get_included_files()) == 1) { header('Location: /* home page or login page */'); exit; }

Is this a good and secure way to prevent direct access to files that are not meant to be accessed directly, such as PHP script files? Do you have any better alternative methods?

You don’t stipulate whether you do this or not, but keep them out of (above) your document root.

/var/www/includes/sensitive.php
/var/www/html/www-root/websiteA.com/index.php
/var/www/html/www-root/websiteB.com/index.php

Name them .php files (not .inc files, like we used to see in the past) so they get run through the parser in case you become susceptible to some directory traversal attack.
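
For example, an entry script inside the document root can still pull in an include that lives above it; the paths below just mirror the layout shown above and are only an illustration:

<?php
// /var/www/html/www-root/websiteA.com/index.php
// The include sits above the document root, so it cannot be requested over HTTP,
// but the entry script can still load it with an absolute path.
require '/var/www/includes/sensitive.php';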

I remember using a similar system. From memory, in pseudocode the first line was something to the tune of:


// Bail out if this file is the script that was requested directly
if (realpath($_SERVER['SCRIPT_FILENAME']) === __FILE__) die();

When I started using OOP it became a pain to start every class file off with that line, and I never suffered (touches wood) from directory traversal attacks, so I dropped the idea.

Thanks for the suggestion. What I am working on is a simple application that does a certain task with some sort of admin panel which requires login. This will be distributed. So, I can’t ask everyone to put those folders above the root. That’s why I am trying to find a solution within the system itself. You can think of it like a CMS, but a very, very simple one.

I have seen various libraries use checking for a constant that is defined in some common script and then use this at the beginning of every file:


if (!defined('MY_CMS_NAME')) exit('Access denied.');

I like this method because it also disallows including the files from scripts outside the library they are part of, thereby increasing security by a tiny little bit.

That’s funny, me too.

I’m just trying to work out how to have the core admin files (there are only 4, in a folder called /edit) live in a central place, so that server X can host any number of websites while sharing only one set of edit files.

I may post my problem on the SP server setup forum if I cannot find a ready solution; it’s going to be something to do with a rewrite rule somewhere, but I’m getting dizzy trying to work out whether that should go in the httpd.conf file or elsewhere …
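
For what it’s worth, a PHP-only fallback would be a thin stub in each site’s /edit folder that simply requires the shared files; the central path below is made up purely for illustration:

<?php
// Hypothetical stub saved as /edit/index.php inside each website's document root.
// The real admin code lives once, in a central location of your choosing.
require '/var/www/shared/edit/index.php';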

As the intention of the code is to stop the file from being accessed directly, you’d just need that one statement at the top of each file. The OOP code would come after it and would not need to test for that individually.

You could always just add a list of allowed IPs to the PHP file(s) and check $_SERVER['REMOTE_ADDR'] against it.
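
A minimal sketch of that check (the addresses below are placeholders):

// Only requests from these addresses get past this point.
$allowedIps = array('127.0.0.1', '203.0.113.10');

if (!in_array($_SERVER['REMOTE_ADDR'], $allowedIps, true)) {
    exit('Access denied.');
}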

This code isn’t mine, but I hope it works for you.

Just put this line in every script that people are allowed to access directly:

define('access', 1);

Then add this check to any script that you don’t want people to access directly:


if(!defined('access'))
{
    echo 'You cannot access this file directly.';
    exit;
}

That’s the same thing Lemon Juice said.

The idea of declaring a constant at your entry point, then testing for it and dying if it hasn’t been set, was previously mentioned. This is the method used by many scripts. It is effective, even if checking for a defined constant at the top of every PHP file is a pain.

Have you considered putting the included files in their own folder (below public_html) and then putting an htaccess file in there with a “deny from all” statement to deny direct access to the included files? Then your scripts could access the files but direct access would be blocked by the server (if you are using Apache). Normally, I try to keep all my important scripts above public_html. But for any includes below public_html, I check for a defined constant and block access with htaccess.

and what about the good old .htaccess file?

Thanks for the pre-defined constant tip, but why would you want to display an “Access denied” message if that is a critical file? Wouldn’t it be better to redirect the visitor (or the bad guy who is looking for a gap in your system) to the home page or login page?

By the way, I got that idea I used in my OP from the following page:

http://stackoverflow.com/a/409738

Some comments say that it is superior to checking a constant, that’s why I wanted to ask here for your opinions too.

I actually don’t want to use .htaccess if I don’t have to, because I don’t want to assume that all users will be on Apache servers.

What makes you think .htaccess only works on Apache?
It’s true that Apache allows a lot more nifty features, but blocking access is a standard .htaccess thing and AFAIK works on IIS also.

Never heard that, do you have a source to confirm this?

This is just an example; do whatever you want when the check fails. I think the best idea would be to display the standard “404 page not found” page that you have for non-existent pages, without an HTTP redirect (you would just do an internal include 'page404.html'; instead). This way an attacker would not be able to tell whether he targeted a real file or not. But don’t forget to set the 404 response code in that case.
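
Something along these lines, assuming the constant check mentioned earlier and a page404.html that already exists in your project:

// Pretend the file does not exist when it is requested directly.
if (!defined('MY_CMS_NAME')) {
    http_response_code(404);           // send the "Not Found" status
    include __DIR__ . '/page404.html'; // show the site's normal 404 page
    exit;
}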

I wouldn’t say it’s superior; it’s just more convenient because you have all the code in one place without needing to define anything. However, I like the constant better because it not only checks whether the file is included but also whether it was included from the proper context, that is, after initializing the library/framework in question.

Thank you very much for the clarification.

Let’s take a step back here and look at the bigger picture. How much of an issue is this? These somewhat convoluted solutions are all well and good, but the simplest method is just using properly structured code in the first place.

If all your code is inside classes or even functions, then even if your first line of defence (keeping the files outside the web directory) fails and the script is executed directly, it cannot do anything detrimental to your server. No code will actually get processed other than a class/function being defined.
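
For example, an include that contains nothing but a class definition (the class name here is just an illustration) produces no output and has no side effects when requested directly:

<?php
// Hitting this file in a browser only defines the class; no code actually runs.
class ReportGenerator
{
    public function generate()
    {
        return 'report';
    }
}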

The whole idea of adding lines of code to each file is ridiculous because even if the file is run in isolation it should not be possible to have it cause any problems on the server or reveal anything useful to an attacker.

Beyond that, it limits the script’s reusability and makes testing difficult, as well as moving it to a different project: your new project needs to remember to define that constant. What if the files for that project define their own constant? Messy. The crux of the issue is a poor separation of concerns. The script itself should not be governing its own access rights.

TomB, you may have a good point here. I’ve never used such techniques but thought they might be useful for distributed libraries. But you are right - there is no real threat in leaving PHP files that contain only classes as they are. Someone might want to use this to prevent others from guessing file names by entering them in a web browser, but that’s something I’d describe as security through obscurity.

Thanks a lot for this additional info. Actually, I haven’t been using classes; perhaps I should revise my scripts and move to a class-based structure as soon as possible.