What are pros and cons of using [R=301,NE,NC,L] instead of just [R=301]?

I was considering using this handy little 301 redirect creation program, but do not understand why it puts [R=301,NE,NC,L] at the end.

What are the advantages to including NE, NC, and L ?

They are described this way: "the URL should not be escaped (NE), it should not be case-sensitive (NC), and it is the last rule that needs to be processed (L)."

GB,

I prefer to use [R=301,L] for the following reason(s):

R=301 tells visitors (SE’s et al) that the redirection is permanent and displays the new link in the browsers’ location box.

L tells Apache to stop processing the current pass through mod_rewrite and start the next pass with the new {REQUEST_URI}

IMHO, those are the valuable flags (unless you do not want the actual URI to be displayed).

NE is an oddball which prevents encoding of special characters in the URI. I can’t imagine why this would ever be used, so, IMHO, it’s not necessary.
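For context, here is one case where NE does matter (the paths are hypothetical, just for illustration): redirecting to a URL that contains a fragment. Without NE, mod_rewrite percent-escapes the substitution, so the `#` arrives at the browser as `%23` and the anchor is lost.

```apache
# Hypothetical example: send /docs to a named anchor in the manual.
# Without NE, the "#" in the substitution would be escaped to "%23"
# and the browser would never jump to the section.
RewriteRule ^docs$ /manual.html#install [R=301,NE,L]
```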

NC specifies that mod_rewrite should NOT care about the case of the characters in the regex. This is valuable when examining the {HTTP_HOST} but can cause UnExPeCtEd and UnDeSiReD results when dealing with URIs, i.e., it’s something to be avoided UNLESS you’re using a RewriteCond to test the contents of the {HTTP_HOST} string.
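To illustrate that point (example.com used as a stand-in domain): NC is appropriate on the host test because hostnames are case-insensitive by definition, while leaving it off the URI pattern keeps /Page and /page distinct.

```apache
# NC is safe here: DNS hostnames have no case
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
# No NC on the rule itself, so URI matching stays case-sensitive
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```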

You might benefit from reading the mod_rewrite tutorial linked in my signature as it contains explanations and sample code. It’s helped many members and should help you, too. There is a section on mod_rewrite flags which, while not all-inclusive, does address the most common ones as well as their uses.

Regards,

DK

Nice explanation DK,

You’re right that defaulting to NC is probably not the best thing to do. I will add the option to switch it on or off and have it default to off.

From recollection, the NE was so I could get it to work properly with some special characters. My tool does the encoding, which means I can control things better.

p.s. It is my tool :wink:

Tigg,

Thanks for the kudos, Tigg.

I’ve created a tool, too, which is accessible from my /seo script. Try it and compare the results. When you can explain the differences (and decide which is better), you’ll be great at mod_rewrite.

Regards,

DK

Some interesting stuff in your tool. I’d say a user really has to understand what they are doing, though. I see options that could easily cause problems. E.g., it looks like your bot blocker would block Google from indexing the website, and it’s on by default? And the default for hotlink protection looks like it would block all requests, including HTML ones, where the referrer is not from the domain.

I’ve always found www redirectors don’t quite provide the full solution. I like your solution as it is generic and you have workable solutions for both directions. The only weakness is that it forces http, so if the original request was https it would switch protocols. I think this issue could be solved by using HTTPS as a condition, e.g.:

# remove www. in {HTTP_HOST}
RewriteCond %{HTTP_HOST} ^www\.([a-z.]+)$ [NC]
RewriteCond %{HTTPS} =off
RewriteRule .? http://%1%{REQUEST_URI} [R=301,L]

RewriteCond %{HTTP_HOST} ^www\.([a-z.]+)$ [NC]
RewriteCond %{HTTPS} =on
RewriteRule .? https://%1%{REQUEST_URI} [R=301,L]

Also, can’t domains contain more than a to z and a dot? How about numbers? I’d be tempted to just use (.*)
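As a sketch (untested), a middle ground between the strict `[a-z.]+` class and a blanket `(.*)` might be to allow the digits and hyphens that are legal in hostnames:

```apache
# Hostnames may contain letters, digits, hyphens and dots
RewriteCond %{HTTP_HOST} ^www\.([a-z0-9.-]+)$ [NC]
RewriteRule .? http://%1%{REQUEST_URI} [R=301,L]
```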

When I get time I’ll have to learn a bit more about the other features of your tool. I’ve personally not dug much deeper than the basics need to sort out SEO issues.

Tigg,

Thanks for the feedback on the mod_rewrite code generator.

Personally, I loathe code generators as they generally miss the obvious (like retaining the secure link, allowing digits in a domain name, or handling Unicode in URLs), but I had hoped that my statement about modifying the output for a particular situation would cover that; the myriad of exceptions would make the generated code too complex for novices to understand.

Regards,

DK

Hi dklynn and Tiggerito , I started this thread a few weeks back, and have learned a bit from your conversation here. Thanks!

But I still am not confident in my htaccess knowledge, and would appreciate any help you can provide with this odd problem.

We are having a weird situation with our site www.easydigging.com (we always use the www version) when trying to set it up as a campaign at Moz.com for SEO monitoring. We can easily set up a campaign for the root easydigging.com, but when we try to set one up for the www version the www always gets mysteriously stripped off. The tech support guys at Moz cannot figure it out.

We do have the domain registration separate from the webhosting and use A Records to make the connection. Since it is an ecommerce site, we also have a few https pages for the cart and checkout.

Here is the first part of my htaccess file. There are quite a few more 301s for individual pages, but all are like this example. Please take a look and let me know if it is set up correctly…

RewriteOptions inherit
RewriteEngine on
RewriteCond %{HTTP_HOST} !^$
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteCond %{HTTPS}s ^on(s)|
RewriteRule ^ http%1://www.%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

# 301 rule for home page

RewriteCond %{HTTP_HOST} ^easydigging\.com$ [OR]
RewriteCond %{HTTP_HOST} ^www\.easydigging\.com$
RewriteRule ^index\.html$ "http\:\/\/www\.easydigging\.com\/" [R=301,L]

# 301 rules for Hoss pages

RewriteCond %{HTTP_HOST} ^easydigging\.com$ [OR]
RewriteCond %{HTTP_HOST} ^www\.easydigging\.com$
RewriteRule ^Garden_Cultivator\/wheel_hoe_push_plow\.html$ "http\:\/\/www\.easydigging\.com\/wheel\-hoe\-hoss\.html" [R=301,L]

Greg,

I’m glad that you’ve learned … but hope that you’ll read the tutorial linked in my signature because it’s helped many members and should explain things for you, too.

There are things in your mod_rewrite code which need attending to, since your DNS records did not resolve the problem.

RewriteOptions inherit
RewriteEngine on


# force www; maintain https if required
RewriteCond %{HTTP_HOST} !^$
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteCond %{HTTPS}s ^on(s)|
RewriteRule ^ http%1://www.%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
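For anyone wondering how the protocol is preserved here, the %{HTTPS}s test works like this (annotated sketch of the same two lines):

```apache
# %{HTTPS} evaluates to "on" or "off"; the appended literal "s" yields
# "ons" or "offs". The pattern ^on(s)| either matches "ons", capturing
# "s" into %1, or falls through to the empty alternative, capturing
# nothing — so the target starts with "https" only for secure requests.
RewriteCond %{HTTPS}s ^on(s)|
RewriteRule ^ http%1://www.%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```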

# 301 rule for home page; strip URI for index.html
RewriteCond %{HTTP_HOST} ^easydigging\.com$ [OR]
RewriteCond %{HTTP_HOST} ^www\.easydigging\.com$
RewriteRule ^index\.html$ "http://www\.easydigging\.com/" [R=301,L]
# don't escape characters in the redirection

# 301 rules for Hoss pages
RewriteCond %{HTTP_HOST} ^easydigging\.com$ [OR]
RewriteCond %{HTTP_HOST} ^www\.easydigging\.com$
RewriteRule ^Garden_Cultivator/wheel_hoe_push_plow\.html$ "http://www\.easydigging\.com/wheel-hoe-hoss\.html" [R=301,L]
# ditto

As for a “specification,” it looks like you’re doing three things:

  1. Force www on domain - nicely done but I don’t believe that the empty {HTTP_HOST} needs to be considered.

  2. Strip DirectoryIndex from the URL - personally, I don’t like this as it defeats Apache - and it may generate endless redirections if your host has set Apache up to force the DirectoryIndex to be shown.

  3. Strip Garden_Cultivator from wheel-hoe-push-plow.html - a Redirect 301 could handle this easier.
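For instance, point 3 could be handled with mod_alias alone (paths taken from the code above); the usual caveat is that mod_alias and mod_rewrite both act on the same request, so mixing the two in one .htaccess needs a little care:

```apache
# mod_alias handles a simple one-to-one page move without regex overhead
Redirect 301 /Garden_Cultivator/wheel_hoe_push_plow.html http://www.easydigging.com/wheel-hoe-hoss.html
```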

Regards,

DK