File/Folder permissions & search engine indexing/security

I see that file permissions are best set at 644 and 755, but for the purposes of not having
the major search engines index my main folders and display all their pages – which Google has done – what about 744?

That gives ‘User’ all privileges and ‘Group’/‘World’ read only.

Also, why not turn ‘execute’ off for all users?

Obviously I don’t want users able to download my php.
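For reference, each octal digit maps to owner/group/world permission bits. A quick shell sketch (the file here is a throwaway temp file, not anything from the thread):

```shell
# Each octal digit is owner/group/world: r=4, w=2, x=1.
# 644 = rw-r--r--, 755 = rwxr-xr-x, 744 = rwxr--r--.
f=$(mktemp)
chmod 644 "$f"
stat -c '%a' "$f"    # prints: 644
rm -f "$f"
# Note: directories need the execute bit to be traversed, which is why
# a world-readable site directory is usually 755 (or 555), not 744 —
# that also answers "why not turn execute off for all users?"
```

Permissions also have nothing to do with whether visitors can "download your php": as long as PHP is handled by the server, requests get the script's output, not its source.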

File permissions and search-engine indexing are not hand in hand; changing one does not affect the other. You really need to look at implementing a robots.txt file if you want to stop search engines from indexing your site.
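A minimal robots.txt sketch (the folder names are illustrative placeholders, not from the thread); it goes at the web root:

```text
# robots.txt — served from http://www.your-domain.com/robots.txt
User-agent: *
Disallow: /private-folder/
Disallow: /conf/
```

Note that robots.txt is advisory: well-behaved crawlers honour it, but it is not an access control.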

After a couple of my sites were hacked, I set all file permissions on all my sites to 404 and all directory permissions to 505. (They’re all static sites; this will not be appropriate everywhere, but it works for me.) The only drawback with this approach is that I need to remember to change the permissions on the directories before I try uploading to them. :wink:
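A hedged sketch of that lock-down scheme, run here against a scratch directory (substitute your own site path with care — and remember the poster's caveat about restoring write permission before uploading):

```shell
# Files 404 = r-----r--, directories 505 = r-x---r-x
# (owner and world can read; nobody can write).
site=$(mktemp -d)                       # scratch stand-in for a site root
mkdir "$site/sub"
touch "$site/sub/page.html"
find "$site" -type f -exec chmod 404 {} +
find "$site" -type d -exec chmod 505 {} +
stat -c '%a' "$site/sub/page.html"      # prints: 404
stat -c '%a' "$site/sub"                # prints: 505
chmod -R u+rwx "$site"                  # restore write access before cleanup
rm -rf "$site"
```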

If you’re talking about folders containing files with sensitive information that you want only your script to read (e.g. lib, conf), the best thing would be to place them outside of the web root.
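A typical layout for that (the names are illustrative, not from the thread):

```text
/home/user/
├── conf/            <- outside the document root; no URL reaches it
├── lib/
└── public_html/     <- document root (web-visible)
    └── index.php    <- can still include files from ../lib/ and ../conf/
```

Scripts can read those files via the filesystem, but no browser request can fetch them directly.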

You should turn off indexes, at least for those folders.

I don’t know if your site is WordPress, but this Codex page might help explain. http://codex.wordpress.org/Changing_File_Permissions

It’s not wordpress.

How do you turn off indexes without stopping the crawl of the pages? (‘noindex’, etc)

Some of these pages do have forms on them that output to the same page.

I’m not talking about search engine “indexing”, but where if a folder doesn’t have a default “index” file (i.e. index.php, index.html, home.html) Apache will display an index of the folder contents.

I see.

So to stop the indexing of the individual folder pages I need an index.html for each folder.

Some folders (one very important one in particular) have a regular page being used as the index for that folder, but without it being named “/index.html”.

Where all pages within that folder are linked out of there.

  • Wonder if Google and the others will recognize it as the index for that folder, without it having the ‘proper tag’

Only other thing I think at this point is to do a redirect through htaccess file since other sites have already linked to the important page without the index tag.

The best I seem to be able to come up with is to set folders to 744 and the pages within them to 644.
Although of course the problem remains of each and every page link showing when entering the folder address in the URL bar.

Sorry for the DP (edit function expired), but it occurred to me that I could also serve a 404 error on each of the folder addresses.

.htm/folder/

If I can do it without affecting the folder pages within. Does this take a rewrite condition?
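For what it’s worth, a bare folder URL can be given a 404 without a RewriteCond at all: mod_alias’s RedirectMatch takes a regex (the folder name below is a placeholder):

```apache
# In .htaccess: return 404 for the bare folder URL only.
# Files inside /folder/ remain reachable by their full paths.
RedirectMatch 404 ^/folder/$
```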

Unless you want Apache to show folder contents, you can put this in the .htaccess file for those folder(s):

Options -Indexes

That way, if someone goes to a folder without specifying a file and there is no “index” file, Apache will return a 403 Forbidden error instead of a directory listing.

If you’d rather send visitors somewhere useful, something like this could work:

Redirect permanent /folder/$		http://www.your-domain.com/folder/folders-main-page.php

Also a good idea to have

RewriteRule ^\.htaccess$ - [F]

The first works nicely (although I don’t think I’d be able to do a single index.htm with this and have it work on other folder addresses) … so I’ll actually have to make my mind up.

Can’t seem to get the 2nd to work - it doesn’t seem to be recognizing the ‘folder’.

Not sure what the 3rd is for.

:d’oh: my mistake, mixing up redirect with rewrite :blush:
It should be

RewriteRule (.*)folder/$ $1folder/folders-main-page.php [L]
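Putting the pieces together, the whole .htaccess might look like this (a sketch, assuming mod_rewrite is enabled on the host; “folder” and “folders-main-page.php” are the thread’s placeholders). Note the `RewriteEngine On` line, which the snippets above assume but never state:

```apache
# Disable automatic directory listings
Options -Indexes

# Serve the folder's designated page when the bare folder URL is requested
RewriteEngine On
RewriteRule (.*)folder/$ $1folder/folders-main-page.php [L]

# Refuse any direct request for the .htaccess file itself
RewriteRule ^\.htaccess$ - [F]
```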

The last is extra insurance that nobody will be able to read the .htaccess file.