How is e-mail input type handled?

I’m curious to know how compatible browsers validate the data in an input text box with its type set to "email".

Do they just look for the most basic of patterns, e.g. any characters split by an @ sign and at least one dot? Or do they also reject what their developers believe are invalid characters?

I don’t believe any browser does any validation of input values. The few browsers that support HTML5 input types just use them as hints for the UI, like displaying a keyboard with @ and . keys on a phone.

Apparently Opera, Firefox and Chrome will all tell you if what’s typed into an email input is not in a valid email format, as mentioned on this page:

Web Forms - Dive Into HTML5

The page points out that it’s a complex function, but apparently the browser is able to do it.
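For reference, the WHATWG HTML spec actually defines what a "valid email address" is, and gives a regular expression for it (a deliberate simplification of RFC 5322). A sketch of that check in JavaScript, with the regex copied from the spec; note that it does not require a dot in the domain part:

```javascript
// Regular expression for a "valid email address" as given in the
// WHATWG HTML spec (a willful violation of RFC 5322, per the spec itself).
const validEmail =
  /^[a-zA-Z0-9.!#$%&'*+\/=?^_`{|}~-]+@[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?(?:\.[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?)*$/;

console.log(validEmail.test("user@example.com")); // true
console.log(validEmail.test("a@b"));              // true - no dot required
console.log(validEmail.test("not an email"));     // false
```

So per the spec, a browser is allowed to accept an address with no dot at all, which explains some of the behaviour discussed below.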

Thanks, Ralph.

That’s what would put me off using this input type. If a browser actually applies a set of rules and decides what is and isn’t valid, I can see problems with that. I’m using a very loose regular expression because of the wide range of valid characters in e-mail addresses (my own sometimes gets rejected by web forms), so I don’t want any current or future browser to get in the way of valid data just for the sake of a clever input type.

Yes, it amazes me how much regular expressions for email addresses can vary. Some are huge, to account for a range of less common addresses/extensions. It would be interesting to see what criteria the browsers use to make a decision.

Yes, good question. Here it says that the pattern attribute

Specifies a regular expression against which a UA is meant to check the value of the control represented by its element.

… which seems to imply that the pattern overrides the browser default. But what if the pattern is less specific than the browser?

That of course also raises the question of whether, if you specified your own regular expression in the pattern attribute, it would override whatever is built into the browser, or whether both regular expressions would be applied.

The answer is: both. (At least for Mac Safari 5.1 and Mac FF 6.0.2 - latest versions as of this writing.)

A basic email field:
<input type="email" id="email" name="email" required>
requires @ to be valid (using the browser’s built-in validation).

Now add a pattern requiring a minimum of 5 characters:
<input type="email" id="email" name="email" required pattern="\S{5,}">
and you have to have at least 5 characters, one of which has to be the @ symbol (due to the browser’s validation). So **@**, @**** or ****@ will be valid, because they satisfy both conditions.
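A quick sketch of how the two checks seem to combine, based on the behaviour described above. Treating the built-in type="email" check as simply "contains an @" is an assumption drawn from these observations, not the browsers’ actual algorithm:

```javascript
// Hypothetical model of what Safari 5.1 / Firefox 6 appear to do:
// a value is accepted only if it passes BOTH the built-in type check
// and the author-supplied pattern.
const builtInEmailCheck = (value) => value.includes("@"); // assumed: just "has an @"
const patternCheck = (value) => /^\S{5,}$/.test(value);   // pattern="\S{5,}"

const isAccepted = (value) => builtInEmailCheck(value) && patternCheck(value);

console.log(isAccepted("**@**")); // true  - has an @ and 5 characters
console.log(isAccepted("a@b"));   // false - has an @ but only 3 characters
console.log(isAccepted("abcde")); // false - long enough but no @
```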

NOW, if you noticed, the built-in verification only requires an @ to be valid, without the usual .** ending (for instance .ca, .com, .museum, etc.). Not sure how/why/where that would be considered a valid email?!?!

So, can we trust a browser’s built-in verification? Or do we need to add a pattern such as:
pattern="\b[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,6}\b"
to force the usual **@**.** format?
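If you do add that pattern, note that the pattern attribute is implicitly anchored to the whole value (the spec wraps it in ^(?: and )$), so the \b anchors are redundant there. A quick check of a cleaned-up version of the same regex in JavaScript:

```javascript
// The proposed pattern, anchored the way the pattern attribute
// applies it (^(?: ... )$ around the whole value).
const strictEmail = /^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,6}$/;

console.log(strictEmail.test("user@example.com"));    // true
console.log(strictEmail.test("a@b"));                 // false - no .tld part
console.log(strictEmail.test("user@example.museum")); // true - 6-letter TLD fits {2,6}
```

Keep in mind the {2,6} upper bound will reject longer TLDs, which ties back to the earlier point about overly strict expressions rejecting valid addresses.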