Just read this… place JS code before the footer instead of in the head tag

Now don’t say OMG where have you been and you call yourself a web designer?!?!?

I am constantly trying to improve my skills and read lots of different books… including a lot from here, different forums, and lots of different sites on the subject! I am currently working on a WP site for a client and I am not too familiar with WP yet. It seems pretty straightforward… create the page in HTML and CSS and then cut it up for WP using PHP. However, I am also reading that for SEO and general purposes it is best to put any JS running on the page down before the footer, or maybe it was just before the closing body tag, instead of at the top in the head tag. Is this really true, and how do others feel about this? If I have an image slider that will be in the header section, will this still be something I should consider doing?

Most front end developers are advocating this practice. Me included!

In theory it speeds up your page by allowing the content to load first and the JS later. I’ve found that in the tests I’ve done, my pages render noticeably faster with JS at the bottom.

There really is no need to have any JS load before the content anyways.
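
If it helps to picture it, here’s a bare-bones sketch of the layout being recommended – the filenames are just placeholders, not anything specific to WP or a particular slider:

<body>

    <!-- all the visible content first, so nothing waits on script downloads -->
    <div id="header">
        <!-- image slider markup -->
    </div>
    <div id="content">
        <!-- page content -->
    </div>

    <!-- scripts last, just before the closing body tag -->
    <script type="text/javascript" src="slider.js"></script>
    <script type="text/javascript" src="site.js"></script>

</body>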

For the full scoop:

Cool thanks for the info. I will place the JS at the bottom of the page that will have the image slider. Good to know this for future sites as well. See you CAN teach an old dog new tricks! :wink:

In terms of actual load times, it does NOT make it faster; it’s a placebo effect because SOME browsers will render the DOM as it becomes available, so you’ll see content rendering sooner… the sad part is, it can actually take LONGER overall, because a script in HEAD can start loading in the background earlier… though putting it there does delay how long it takes for the page to start rendering in some browsers.

It’s much akin to how Opera feels slow to some people despite actually finishing loading the page faster… This illusion occurs because Opera doesn’t redraw the window arbitrarily after every piece of information, but instead waits for certain things to finish (DOM complete, all CSS) or for a timer to expire (Tools -> preferences -> advanced -> browsing, under “loading” there’s a SELECT, typically set to ‘every one second’) before attempting a reflow/redraw… so while it’s physically faster, the lack of showing you it’s doing anything makes it feel like it takes longer. (this was more true when Opera’s default for that was every three seconds!). It’s funny in that case because if you set Opera to a high ‘redraw’ delay like every 5 seconds, the page loads significantly faster, but it feels like it takes forever because you don’t actually see it doing anything.

Perception is everything.

Though really, if you have enough script that being in <head> versus right before </body> makes a difference, you probably have too much scripting… like say some idiotic framework like jQuery or mooTools.

There is one REALLY good reason though for running scripts manually right before </body> – the DOM is completely built at that point, so if you run any scripts that would change the DOM, those changes can occur before CSS is applied to the page… this reduces the “flash of unstyled content” (aka FOUC) that can occur if you wait for “onload” to fire… doing this I usually load the script in head, then manually call a ‘startup’ function right before </body>. Actually loads faster while reducing FOUC.
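
A rough sketch of that pattern, with made-up file and function names – the idea being that startup.js only defines functions, and nothing runs until the inline call at the bottom:

<head>
    <!-- the file downloads (and caches) early, but only defines functions -->
    <script type="text/javascript" src="startup.js"></script>
</head>
<body>

    <!-- ... page markup ... -->

    <!-- the DOM above this point is fully built, so it's safe to call here -->
    <script type="text/javascript">
        startup(); // hypothetical function from startup.js that does the DOM changes
    </script>
</body>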

You’re correct that total load times generally won’t be faster, but the majority of what your users came to see – your content – will be available sooner. That’s a good thing. Plus, let’s not disregard the placebo. If users perceive faster performance, they’ll have a better experience. e.g., http://stevesouders.com/hpws/move-scripts.php

Also, your emphasis on “some” browsers is misleading. The overwhelming majority of the browser market share – IE, Firefox, Chrome and Safari – will render the DOM as it becomes available. Opera may be the only exception, at a mere 2% market share.

Sorry, but this part is flatly untrue. The total load times will be largely the same.

I’m assuming you’re thinking that the time taken to redraw the page is what would make scripts at the bottom slower. But either the process doesn’t play out like you think, or the redraw time just isn’t significant enough to matter, because when you actually measure load times, scripts at the bottom comes out equally fast or faster.

The best JavaScript programmers around – such as those at Mozilla, Yahoo, and Google – advocate JS libraries. So at minimum, you should tone down the inflammatory rhetoric. And if you choose to go a step further… when the best JavaScript programmers disagree with you, it’s probably worthwhile to reconsider your opinion. Make sure you can explain the opposing arguments just as its advocates would.

Why not use the wealth of tools that are available to visually see the results of including stuff in the header or just before </body>?

Also use a subdomain for loading particular scripts or images - this has the effect of displaying the page then filling in the blanks.
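
Something along these lines, for instance – static.example.com here is a made-up subdomain standing in for whatever you actually set up, and the point is that the extra hostname gives the browser another pool of parallel connections to pull files over:

<!-- the page itself is served from www.example.com -->
<link rel="stylesheet" type="text/css" href="http://static.example.com/css/screen.css">

<img src="http://static.example.com/images/slide-1.jpg" alt="First slide">

<script type="text/javascript" src="http://static.example.com/js/site.js"></script>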

A very good suggestion.

Scripts at the bottom is both perceptively faster and objectively faster.

@Jeff Mott

Thank you, and also for enlightening me as to how to use the image expire. Maybe now I will be able to match your “Perf grade”. Yours is 100% - pure magic :slight_smile:



 <img src="/bin/sleep.cgi?type=gif&sleep=2&expires=-1&last=0&imagenum=2&t=1331922946" height=20>


I wasn’t necessarily saying it was a bad thing… as I said, perception is EVERYTHING… and if that means making it slower while people THINK it’s faster, more power. User feedback is all important, part of why X11 and most *nix desktops running atop it are so uselessly crippled – you can’t even get the cursor to update when a program is loading, so by the time you figure out it actually did start loading something you’ve launched five copies.

Actually FF and Chrome will start trying to download them too – but they won’t process them until everything else is done OR (and here’s the kicker) you call a function from inside them. Moment you call a function the browser doesn’t know, it starts fishing through all the available scripts or those you’ve tried to load.

Depends on how many files you have and their sizes – browsers can download four to eight files from each server referenced on a page – the earlier you start those extra downloads, the sooner you’ll have all the files downloaded.

Though, yes – gecko and trident still have their heads wedged up their backsides about starting those extra downloads as soon as they can – which is why they both are where FOUC is the biggest headache.

Yeah, and they’re pissing all over the Internet in the process with fat bloated slow rubbish sites filled with needlessly cryptic and harder to maintain code. jQuery in particular has reached epic proportions of idiocy in its use on just about every site that runs it – which is why the Internet IMHO is less useful to me today than it was a decade ago.

From 10k + the library to do what 5k of script without the library could accomplish, to websites with half a megabyte of scripting just to make the page HARDER to use – it’s time for the various little js kiddies out there to be reminded scripting should be used to enhance functionality, not replace it. When we have people vomiting up megabyte sized websites where the images are only a quarter of that just to deliver 2k of plaintext and a dozen content images, it’s time to just say NO to frameworks.

[ot]
This is your site:

This is your site on jQuery:

Any questions?[/ot]

The important distinction in this case, however, is that scripts at the bottom can actually make your page faster. Not just perceptively, but also objectively. This decision should be a no-brainer.

No, it doesn’t depend on any of those things. You’re talking as if the browser will sit around and wait unless the scripts are at the top. That doesn’t make any sense. If your scripts are at the top, then the scripts will download first. And if your scripts are at the bottom, then your images and content will download first. At no point are you delaying downloads.

Several issues here:

  1. You’re putting too much emphasis on file size. The great thing about performance research is that it tells us where the bottlenecks are, and HTTP requests are the biggie. It’s pretty common for the browser to spend 80% of its time just waiting for a connection, and only 20% actually downloading data.

  2. You consider jQuery cryptic and hard to maintain… that’s fine, that’s a valid opinion. All I can say here is that almost everyone disagrees. They think jQuery makes code incredibly easy to write and read. That’s why it’s caught on.

  3. You seem to be thinking of websites with either annoying or poorly implemented features, and casting the blame on jQuery. I think that blame is misdirected. If I found a junky website that didn’t use jQuery, should I conclude that JavaScript is to blame? Of course not. There exist, and always will exist, bad programmers, and bad programmers will write bad code whether they use jQuery or not. But good programmers who use jQuery do so because it lets them write simpler, shorter code.

It’s probably fair to say that libraries like jQuery make good programmers better, and bad programmers worse.

Unless you are running something in the scripts that manipulates the DOM or adds/removes content/classes, etc… which most scripting nowadays tends to do… I think that’s the difference… if you’re just loading scripts for… well, the sake of loading scripts, then maybe. Splitting hairs really – if it feels faster to users to put them in BODY, we should put them in BODY for that reason alone.

Preaching to the choir on that one. 200ms ‘real world’ average per file request, only 4 to 8 at a time for overlap, and on crappy connections it can be as much as two seconds; trust me, I regularly visit Coos County NH where 33.6 dialup is the fastest connection available at ~1200ms ping times. Can’t even get wireless phone coverage. It’s why I advocate NOT using endless multiple script files – combining them down before deployment instead – and using image recombination techniques like the incorrectly named “CSS Sprites”.
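
For anyone who hasn’t run into the sprite technique being referred to, here’s a minimal sketch – the image path, icon sizes and class names are invented purely for illustration:

/* one combined image = one HTTP request; each icon shows a different slice of it */
.icon {
    display: inline-block;
    background: url(/images/icons.png) no-repeat;
    width: 16px;
    height: 16px;
}
.icon-home  { background-position: 0 0; }
.icon-mail  { background-position: -16px 0; }
.icon-print { background-position: -32px 0; }

Markup-side you’d then write something like <span class="icon icon-mail"></span>, and the browser fetches icons.png once instead of three separate images.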

Off Topic:

Always feels like a dream when I get home to 22mbps at 100ms ping to most of Europe and 500ms+ to chicagoland. (that whole new england backbone divide thing – faster 3000+ miles across the pond and to all of Canada than I am to Boston which is only 110 miles away).

Yup, and it makes me wonder just what the devil is in that kool aid… Maybe it’s that three decades plus of writing software, much of it in interpreted languages, drilled home the concept that libraries in an INTERPRETED language – much less one reliant on a narrow pipe – are a REALLY BAD IDEA.

Off Topic:

Or that I’m a Wirth language fan, and as such HATE unnecessarily cryptic code. Why I think Rust, Ruby, and a whole host of other languages are idiotic nonsense; hell, I’d rather hand-assemble 8k of RCA 1802 machine language and enter it 8 bits at a time on toggle switches than deal with 100 lines of C code for the same reason: pointlessly and needlessly cryptic… and when it’s pointlessly cryptic compared to machine language, why bother even having the high-level one?

Which is funny because I’ve rarely actually seen that – usually it’s bigger with jquery, but that’s typically because the people who use it then rant and rave about nonsense like “don’t use WITH” or “objects for EVERYTHING”. Either that or it’s stupid annoying animooted nonsense that shouldn’t even be on the website in the first blasted place and just pisses me off as a user. Really that’s about the only thing jquery allows coders to do faster/smaller; annoying animated nonsense that just gets in the way of delivering CONTENT.

I’d say it makes them both worse – because I’ve yet to see a website not made worse by its very presence.

I think I understand what you’re getting at with this point, that if you have some JS widget that has its own set of images, that those images won’t start downloading until after the widget has been created. If I understood correctly and that’s what you meant… it still won’t make a difference. Even if those scripts were in the head, you’d still have to wait for the DOM ready event, which means that widget still won’t be created any earlier.
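
Put another way, roughly this comparison – buildSliderWidget() here is a hypothetical setup function, and older-IE fallbacks for the ready event are glossed over:

// Script loaded in <head>: it still has to wait for the DOM to be ready,
// so the widget (and its images) can't start any earlier than this event.
document.addEventListener('DOMContentLoaded', function () {
    buildSliderWidget();
}, false);

// The same script placed just before </body>: everything above it already
// exists, so the exact same call can simply run immediately.
buildSliderWidget();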

If you’re going to make this debate about credentials, then you still need to remember that the best developers – such as those at Google, Mozilla, and Yahoo – don’t share your view. And calling it kool-aid is a poor excuse to ignore the opposing arguments.

My experience – and I’d wager most people’s experience – has been the exact opposite. Here’s a perfect example of simpler, shorter code using jQuery:

$("p.neat").addClass("ohmy").show("slow");

Without jQuery, you’d have to get all P elements, iterate over them, search their class names for “neat”, and for each one you find, add a class (being careful not to clobber any existing classes), then manage several concurrent animations with intervals, all the while avoiding cross-browser pitfalls. You’d have to write at least one, probably closer to two, dozen lines of code. It’s a lot more work, a lot more time, and a lot more opportunities to introduce bugs.
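
To make that concrete, here’s roughly what just the class-adding part looks like without jQuery – using getElementsByTagName for the sake of older IE, and leaving the animation out entirely:

// Add "ohmy" to every <p> whose class list already contains "neat".
var paras = document.getElementsByTagName('p');
for (var i = 0; i < paras.length; i++) {
    var classes = ' ' + paras[i].className + ' ';
    if (classes.indexOf(' neat ') !== -1 && classes.indexOf(' ohmy ') === -1) {
        paras[i].className += ' ohmy';
    }
}
// ...and the show("slow") animation would still have to be written on top of this.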

Alternatively, with jQuery, we can write one very clear, very concise statement, which has been widely tested and refined for both correctness and performance.

Again, you’re misdirecting criticism of bad programmers or clients with silly requests, and laying all that blame on jQuery.

So two lines of code for that – consisting of a double getElementsBy___ followed by a for/in and .className += " ohmy"… Yeah, so tough and so worth a 90k (30k compressed) library.

BINGO – as you said:

As that’s EXACTLY the type of animated garbage I was talking about.

Oh, as to Yahoo, how’s the joke Colbert made go? “I for one am shocked – yahoo still has customers?”

As to Mozilla, never been a fan of their buggy bloated garbage in the first place – If I wasn’t developing websites I wouldn’t even have the unstable slow pile of junk known as Firefox installed… and all you have to do to look for proof they’ve gone 'round the bend is look at their wonderful new project “rust”… Rust: the programming language for the people who thought C++ was just a little too cryptic – which is like saying the Puritans who came to Boston in 1630 did so because the CoE was a bit too warm and fuzzy for their tastes.

Then there’s Google – who used to be the champion of good sites and it got them to the top – and now that they’re at the top they’re pissing all over accessibility, bogging their search down with idiotic scripted nonsense which is why a great many of us have switched to duckduckgo for searches. Between the illegible color contrasts of switching the menu bar to grey on black, crappy absurdly undersized fixed metric fonts on all the controls, the bandwidth hogging train wreck of predictive search – it’s like they’ve forgotten what it was that made them great, and instead are aiming to replicate what flushed “ask jeeves” down the toilet around the same time as the dotcom bust.

Mmm… no, unless you ignore the most-used browser – IE – which doesn’t support getElementsByClassName. You’ll be coding that manually. You also haven’t factored the animation into the lines of code you’ll be writing.

Please don’t exaggerate my arguments. You know I was giving an example of how jQuery can be simpler and shorter. I never said or even implied that this one line alone is worth 30K. Real-world sites rely heavily on the full extent of the library.

Plus, you still need to grasp just how little impact the file size has.

A total of 95ms, only 13 of which was spent actually downloading the script. 13.

I’ll concede that too much animation can get annoying… but do you really consider all animation to be bad? That seems a bit extreme. And what do you tell your clients or your boss when they explicitly ask for it? “I don’t do no animations!”

I have to agree with Jeff on this one.

I’ve defended jQuery many times (primarily against DS), but here it goes again. Yes, there are tons of bad developers who include jQuery in projects that are otherwise 10KB of code because it’s the cool thing to do (or something). That is ridiculous.

However, if you are writing a lot of Javascript (like for a web app), jQuery can tremendously cut down on the amount of code you have to write.

For example, take these:


// Normal JS - 32 characters
document.getElementById('book');

// jQuery - 11 characters
$('#book');

With just that simple operation, jQuery saved me 21 characters. jQuery is currently 31KB compressed and gzipped. That’s roughly equivalent to 31,000 characters. It’d only take about 1477 of these calls to make jQuery a file-size saver.

Yes, you wouldn’t normally do 1477 of this type of operation, but there are tons of different ways that jQuery makes things shorter. A script base that would normally be something like 100KB could be cut in half by using jQuery, with the added benefit of very reliable cross-browser functions.

As for Javascript being in the body, from all the tests I’ve seen it is definitely a good idea. It may cause your Javascript to finish up a bit slower, but your clients will see the page sooner. Hopefully you aren’t using much Javascript to change the appearance of things, so most people will never notice. If you needed to, you could also split it up so you load the DOM-changing stuff in the header and the interaction in the body.

As for Yahoo, while I’ve never been a fan of their site, their developer stuff seems to be pretty good (at least the high-performance research stuff… I’m not sold on the YUI or OOCSS).

Mostly just when it’s javascripted… because it’s annoying, buggy, slow, and typically requires more javascript than I’d ever allow on a page in the first place… but then I usually consider 70k an ideal target size for an entire template; HTML+CSS+SCRIPTS+IMAGES, not counting content images or text… with 140k being the upper limit of being acceptable. Of that, if there’s more than 10k of scripting for the page, and it’s NOT an actual application like google maps or apps, then you’ve got a bunch of goofy crap that just gets in the way of using the page.

But I’m the same way about all this new desktop garbage. I click on minimize, I want it gone NOW, not after some stupid animation runs. I drag a window, I want to see what the window is going to look like while dragging, NOT have it distorted by some stupid animation. I click on a menu, I want to see what’s in the blasted menu NOW, not after some stupid animation plays… (and preferably the whole menu, not what I just so happen to have used recently – you know, what a MENU is for?) Conversely, when I enter a search, I’d like to finish typing BEFORE it starts looking and wasting my bandwidth and seizing the focus so I can’t even continue typing without the browser going off to never never land; preferably not with some stupid dog recycled from Microsoft Bob asking if it can help me like Clippy’s daft cousin.

I’m not rocking a 22mbps connection on an i7 870 with a GTX 560ti to have things run slower and in a less useful manner than they did on an AMD 5x86/133 running Windows 3.1 on dialup in 1995!

For the handful of people I still service (retired now), I explain that they are throwing accessibility out the window, wasting bandwidth and slowing the page to the point it’s going to double, triple or even quadruple the bounce rate, and explain that – much as I tell the photoshop jockeys – people visit websites for the CONTENT, not the stupid slow broken scripted garbage you put on top of it, the ‘gee ain’t it neat’ graphics you put on top of it, or any of the other “gee ain’t it neat” nonsense that is increasingly making the Internet less useful to users than it was a decade ago.

Because “hey neat” lasts five seconds, “damn annoying, I’ll go somewhere else” lasts a lifetime; and when it comes to scripting, the 2 million plus users of the noscript plugin for FF, the couple hundred thousand users of workalikes in Chrome, and the millions of Opera users (84 million doesn’t sound quite as dismissible as 3%, does it?) who use per-site script blocking know exactly what I’m talking about!

Look at webmail – ALL of them; so knee deep in scripted asshattery it’s sending users scrambling back to mail clients. I’m actually enjoying M2 and thunderbird is alive and well – funny since just four to five years ago everyone was predicting the demise of mail clients.

Gmail, Yahoo, Hotmail – or server installable like roundcube – they all make squirrelmail look useful by comparison, since at least it doesn’t break middle clicking on messages to open them in new windows/tabs or forward/back navigation! That they all waste several megabytes on delivering plaintext, then have the cojones to sell the suits on it as saving bandwidth, is mind-numbingly unbelievable… and making Squirrelmail look good is like calling a 1984 Yugo GV built for the American market a quality automobile.

You’re blatantly misinforming your clients. Just FYI.

Two examples of which would convince me to not even LOOK at anything else they offer… one being more fat bloated idiocy, both boiling down to the use of presentational markup and completely missing the entire point of HTML and CSS.

Ok, try this – give me an example of one that DOESN’T suck. I have yet to see a site use jquery that wasn’t a total mess… Show me one that isn’t.