Protect Files and Folders via PHP

Hello !

I am working on a script for my company.

The requirement is that only authenticated IPs can access certain areas of our site (through browsing / an application).

Our site has about 78 folders, each containing many files. I have built an admin control panel to add IPs and grant them access to ALL or selected folders, and this is all saved in a database.

Now, I want to know the best way to implement what I have done. That is, how do I enforce that only the allowed IPs can access these folders?

I am thinking my script could update/create a .htaccess file in each folder, fetching the allowed IPs from the database and writing them into that file. Would that solve my case? Is there a better approach?

I want the same thing for search engine bots, so should my script create a robots file for each folder as well?

Thanks & Best Regards
Zee

Note that the bots can still read the files in the directory. Most search engines will respect your request to not index a directory. Who knows if they store it or not.

Protected directories should be above the public level of your site. A PHP script loads the document and sends it to the browser if the user is authenticated.
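
Something along these lines, perhaps. The layout assumed here is a private/ folder one level above the web root, and is_allowed() is just a placeholder for the IP check described below:

// download.php: a minimal gateway sketch; the folder layout and is_allowed() are assumptions
$allowed = is_allowed($_SERVER['REMOTE_ADDR']);     // your own lookup against the database
$file    = basename($_GET['file']);                 // strip any path components from the request
$path    = dirname(__DIR__) . '/private/' . $file;  // folder one level above the public web root

if (!$allowed || !is_file($path)) {
    header('HTTP/1.1 403 Forbidden');
    exit('Access denied');
}

header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($path));
readfile($path);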

When you fetch the allowed IPs from your database, store them in an array. Then check whether the current IP is in that array, using in_array. If not, redirect/deny access. Example:

$ips = array();

// Fetch the allowed IPs from the database
while ($data = mysql_fetch_assoc($results)) {
    $ips[] = $data['ip'];
}

// Compare the visitor's address against the allowed list
if (!in_array($_SERVER['REMOTE_ADDR'], $ips)) {
    // Refuse access
    header('HTTP/1.1 403 Forbidden');
    exit;
}

As for the files/folders being indexed by robots, deny all access by default unless a user is authenticated. So make sure your check script is always used as the gateway to these files/folders.

@dsmIT: Why pull all records out of the database when you can just pull out the matching record with the same IP? That way, if no record is returned, you’ll know it’s not in the DB.

No need to pull them all out and put them into an array; that just wastes CPU cycles.
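
Something like this, perhaps, sticking with the mysql_* functions used above (the table and column names are placeholders):

// Look up only the row matching the visitor's IP (table/column names are assumptions)
$ip     = mysql_real_escape_string($_SERVER['REMOTE_ADDR']);
$result = mysql_query("SELECT ip FROM allowed_ips WHERE ip = '$ip' LIMIT 1");

if (mysql_num_rows($result) == 0) {
    // No matching record, so this IP is not allowed
    header('HTTP/1.1 403 Forbidden');
    exit;
}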

Hi !

Thanks a lot for all your replies. But why not do it via .htaccess, as I mentioned in my post?

Actually, the solutions you have suggested so far are (I guess) good if a user is trying to access via a browser. But what if a user is trying to access via some application, or FTP, etc.?

You cannot control that via PHP. PHP is a language that runs on the server and is typically executed as an Apache module when a page is requested.

If you’re looking to restrict the actual access to directories via FTP or file sharing then PHP and web server technology won’t help you.

Ah! So even if I manage to create a .htaccess file with all the IPs that are allowed to see a folder, it cannot be done?

Oh it can be done.

PHP can write a .htaccess file as easily as it can write a .txt file. In fact, they are identical processes.


file_put_contents(".htaccess", "deny from all");

Note: Obviously, this method of implementation needs to be very highly sanitized/validated, as it can lock out everyone quite easily. It also only updates when the file is rewritten, and only comes into effect on the next pageload.
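
To grant access to only the IPs from the database rather than denying everyone, the same idea can be extended with an Apache 2.2-style allow list (on Apache 2.4 the equivalent would be Require directives). The $ips array and the folder path below are assumptions:

// Build Apache 2.2-style rules: deny everyone, then allow each stored IP
$rules = "Order deny,allow\n";
$rules .= "Deny from all\n";
foreach ($ips as $ip) {
    $rules .= "Allow from " . $ip . "\n";
}

// Write the rules into the folder being protected
file_put_contents('/path/to/protected/folder/.htaccess', $rules);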

For Apache, yes. I don’t know of any FTP server that uses .htaccess files, though. I don’t think Samba or Windows file sharing take any notice of .htaccess either.

So the bottom line is that we cannot control users coming to our server from some application / FTP through PHP and a PHP-written .htaccess file?

No.

Via website access with PHP, yes, but for FTP or file sharing, no, PHP won’t do the job.

Well, I know that PHP won’t do the job for FTP etc., but this is not about PHP. PHP is only there to read/write the .htaccess file and nothing else.

Now, the question is: will the .htaccess file work in the case of FTP etc.?

No, No, No, No, No, and No!

Unless you use an FTP server which (as I keep saying) specifically uses .htaccess files, then NO, it will not work with FTP. The same also applies to file sharing like Samba / Windows file sharing.

Only the Apache web server will use the .htaccess file.

Would you like me to repeat that again?

Lolz ! No ok thanks !

Yeah, I’d like you to repeat it again. It is… theoretically, possible.

Instead of just writing a .htaccess file, you’d have to be writing the FTP daemon’s user/permissions files, or else executing the console commands to do so, although this is not recommended.

A bad idea? Probably. Impossible? No, not really.

That’s a very long shot, and it’s certainly not going to work via the .htaccess file.

If it’s doable then do feel free to provide code rather than just theorising it and giving the OP hope.

Would depend entirely upon which FTP daemon they’re using and what configuration mode it’s in (Is it set to use the Linux users list as its user list? Does it have its own list? Where is that list? How does it store its passwords? Does the Apache user have access to those files? Is PHP configured to allow system calls?)

Simple statement: PHP can manipulate any file it has access to.
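
Purely as an illustration of that statement, and not a recommendation: if the FTP daemon happened to read its allowed addresses from a plain-text file that the web server user could write to, PHP could rewrite that file just as it writes a .htaccess file. The file name, format and location here are entirely made up:

// Hypothetical example only: assumes an FTP daemon configured to read a plain-text
// allow list, one IP per line, from a file the web server user is able to write.
$listFile = '/etc/ftpd/allowed_ips.conf';   // made-up path, not a real daemon's default

file_put_contents($listFile, implode("\n", $ips) . "\n");   // $ips fetched from the database as before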

That’s the point. We all know that, but it’s the rest of what you’re saying is possible that the user asked about WITH .htaccess files.

.htaccess files are for Apache. Yes, you can write / edit other files, but that wasn’t what the user was asking, hence my answers.

It would be nice if we could perhaps resolve this for the user though - maybe he’ll tell us more about his FTP system?

Hi !

I have to get more info on the FTP side. I really appreciate StarLion, who at least pointed out a way to reach this solution. I had an idea that PHP can update any file it has access to, and I had a little / vague idea that even FTP etc. can be controlled by creating some file on the server.

Anyway, guys, thanks a lot for providing help on this, and StarLion, thanks a lot for giving me another way.

Cheers !