2Mb Web Pages: Who’s to Blame?

An excerpt from http://www.sitepoint.com/2mb-web-pages-whos-blame/, by @ceeb


I was hoping it was a blip. I was hoping 2015 would be the year of performance. I was wrong. Average web page weight has grown another 7.5% in five months and exceeds 2Mb.

According to the May 15, 2015 HTTP Archive Report, which draws its statistics from almost half a million web pages, the biggest rises are for CSS, JavaScript, other files (mostly fonts) and—surprisingly—Flash.

The average number of requests per page:

  • 100 files in total (up from 95)
  • 7 style sheet files (up from 6)
  • 20 JavaScript files (up from 18)
  • 3 font files (up from 2)

Images remain the biggest issue, accounting for 56 requests and 62% of the total page weight.

Finally, remember these figures are averages. Many sites will have a considerably larger weight.

We're Killing the Web!

A little melodramatic, but does anyone consider 2Mb acceptable? These are public-facing sites—not action games or heavy-duty apps. Some may use a client-side framework, but they should be in the minority.

The situation is worse for the third of users on mobile devices. Ironically, a 2Mb responsive site can never be considered responsive on a slower device with a limited—and possibly expensive—mobile connection.

I’ve blamed developers in the past, and there are few technical excuses for not reducing page weight. Today, I’m turning my attention to clients: they’re making the web too complex.

Clients normally view developers as the implementers of their vision. They have a ground-breaking idea which will make millions—once all 1,001 of their “essential” features have been coded. It doesn’t matter how big the project is: clients always want more. Feature-based strategies and “release early, release often” are misunderstood or rejected outright.

The result? 2Mb pages filled with irrelevant cruft, numerous adverts, obtrusive social media widgets, shoddy native interface implementations and pop-ups which are impossible to close on smaller screens.

But we give in to client demands.

Even if you don’t, the vast majority of developers do—and it hurts everyone.

We continue to prioritize features over performance. Adding stuff is easy and it makes clients happy. But users hate the web experience; they long for native mobile apps and Facebook Instant Articles. What’s more, developers know it’s wrong—Web vs Native: Let’s Concede Defeat.


Continue reading this article on SitePoint!

6 Likes

I totally agree that it’s worrying/annoying/irresponsible, but people do it because they know they can get away with it. Page weight has scaled with internet connection speeds. From dial-up to broadband to “superfast” broadband and 4G, page weights have increased as each technology has become more widespread, precisely because developers could get away with it.

2 Likes

I think you’re making a mountain out of a molehill. It really isn’t that big of a deal. Why limit software to the lowest common denominator? That makes no sense and spits in the face of hardware advancement.

While it may be true that internet connection speeds are increasing, this isn’t sufficient reason to push for heavier page weights. While some of what is being done can be beneficial, most of it is pure bloat. This, in and of itself, is not worrisome. However, when you take into consideration all the other cr@p out there that is contributing to choking the bandwidth (specifically SPAM), every little bit of conservation helps.

Not to mention that not everyone is on “broadband” internet (neither by the current definition nor by the earlier, tiny one). There are still a large number of people using “dial-up” internet, and many DSL packages are only half a step above dial-up speeds. @oddz, in my humble opinion, seems to suggest that we leave those unfortunate souls in the dust. I heartily disagree. Many or most of those in that slow category aren’t there by choice: either they can’t afford a better connection (because, let’s face it, introductory pricing almost doubles once the first 12 months are up), or there is nothing better available in their communities.

So, no, I don’t think this is making a mountain out of a molehill. I think this is right on target and should not be so easily dismissed. As developers, it is part of our responsibility to trim the fat wherever we can without sacrificing quality. It is understood that not everything can be easily trimmed, but we should at least make an effort.

V/r,

:slight_smile:

5 Likes

Because of the mobile market. The market I live in has limited high-speed coverage and limited free wifi locations, so these script-heavy sites (often unnecessarily heavy) load slowly, if at all.

4 Likes

What proof do you have that most of it is bloat? To quantify bloat, one would need to be knowledgeable about the business requirements of any given project. A project that contains large amounts of JavaScript, CSS and images isn’t necessarily “bloated”. Websites exist to fulfill business needs. I will say it again: neither clients nor developers should be limited by the lowest common denominator.

  • CSS is very rarely bloat. If you’re doing some animation which doesn’t add benefit, perhaps (but even that is better than most JavaScript animations).
  • Images can be bloated if they’re not optimized for the web (it’s amazing just how many people don’t even bother with that step).
  • JavaScript can be valuable IF it adds additional benefit. If it’s doing something that comes natively to the browser via CSS or plain HTML markup, it’s bloat, plain and simple (see the sketch below).
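To make that last point concrete, here’s a minimal sketch (my example, not from this thread) of replacing a typical smooth-scroll plugin with native DOM calls. `scrollIntoView` with `behavior: 'smooth'` is a real API in modern browsers; the selector and guard logic are just illustrative:

```js
// Smooth-scroll to in-page anchors without any plugin or library.
document.querySelectorAll('a[href^="#"]').forEach(function (link) {
  link.addEventListener('click', function (event) {
    var id = link.getAttribute('href');
    if (id.length < 2) return; // ignore bare "#" links
    var target = document.getElementById(id.slice(1));
    if (!target) return;
    event.preventDefault();
    target.scrollIntoView({ behavior: 'smooth' });
  });
});
```

And even this much JavaScript is arguably optional: the CSS `scroll-behavior: smooth` property achieves the same effect declaratively where it’s supported.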

No, but this argument harkens back to the old blink tag or animated GIFs: just because you CAN do it doesn’t mean you SHOULD do it. It’s your job as a designer/developer to help guide a client to what’s appropriate and professional, and some things should be used judiciously, if at all.

4 Likes

I think of it as “Netiquette”, having consideration for others.

When I first got online years ago (on a dial-up connection), I was so excited about being able to send photos to friends that I did so.

Until I later met one at a party who said to me “Please don’t send me emails with so many pictures, I have a slow internet and they take too long.”

Once I became aware, I stopped the practice.

I imagine it’s similar today. Those with high-octane setups never see a problem and don’t even realize the pain they might be inflicting on others.

5 Likes

Well, bandwidth costs money, and some pretty prominent studies have shown that reducing page weight and speeding up load times have a huge effect on business returns. I’ve recently read a new book on this—Lean Websites, available at Learnable—and it’s an eye-opener.

2 Likes

Well, either way, nothing is going to change.

That comes down to knowing your audience. For example, a supplier of luxury brand goods need not care about the lowest common denominator. I’ve said it before and I’ll say it again: the internet is a privilege, not a necessity. So if site x doesn’t work for person y on a poor connection, no real harm is done. It is, after all, JUST the internet. No need to get panties in a wad.

Normal users don’t care about page weight; they just care about load time. If that is the argument, then I fully agree. At which point I would say there are many ways to improve the performance of any website without limiting features to serve the lowest common denominator.

I don’t think the article is about limiting features, but about optimizing them, and making sure they add value. Optimizing a 2Mb image isn’t limiting a feature. A fast-loading site is really important.

4 Likes

I think the main cause of large web pages is that it is so easy to include an enormous CSS file just to obtain a single feature. This tactic is repeated numerous times to obtain another single feature. Same applies to JavaScript files.

Maybe there is a niche for extracting only the relevant features into combined CSS and JavaScript files and, most importantly, omitting the repeated dross.
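As a rough sketch of that idea, a build step can merge just the needed files into a single asset so the browser makes one request instead of several. The file names below are hypothetical, and a real project would use an established build tool rather than a hand-rolled script:

```js
// combine.js: naive build step that merges selected CSS files into one.
// Run with: node combine.js
const fs = require('fs');

const parts = ['reset.css', 'grid.css', 'buttons.css']; // hypothetical names
const combined = parts
  .map((file) => fs.readFileSync(file, 'utf8'))
  .join('\n');

// Crude whitespace squeeze: a stand-in for a real minifier.
fs.writeFileSync('site.min.css', combined.replace(/\s+/g, ' ').trim());
```

Even something this blunt chips away at the report’s average of 7 style sheet files per page.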

Perhaps Google’s new emphasis on mobile-friendly sites will bring page weight down.

1 Like

Sometimes you can justify a heavy page weight for business reasons. If you don’t have enough time or money to implement a nice, lean, clean site, but can chop out a bloated one which does the job for the primary target market (if that market is mainly people with high-speed connections, in London for example), then it can be the difference between a business existing or not. But I think this can only be justified for short periods, as part of a high-risk business strategy, and the site should be properly redeveloped for the long term once proof of concept has been established or a time-sensitive market has been grabbed.

However, there is a very destructive mentality: if you think “oh, a bit of bloat doesn’t matter”, a bit is introduced, then a bit more, and a bit more. Eventually you realise you have a problem as performance and/or maintainability degrades, and cutting bloat out of a very bloated system is a huge task. It is often cheaper and easier to throw it away and start from scratch.

1 Like

When someone loads the complete jQuery library (the full development build or the minified production one) for one or two things, that’s bloat. And it happens way more often than one might suspect. The same goes for MooTools or any other JS library. Some developers either don’t want to learn vanilla JS, or aren’t in a position (for whatever reason) to write, or even copy and modify, vanilla JS to get an effect that jQuery or MooTools or whatever can achieve in a few simple lines of code.
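For illustration (these pairings are mine, not the poster’s), the sort of one-or-two-line jQuery usage being described usually has a direct native equivalent, assuming elements with these hypothetical IDs and classes exist on the page:

```js
// Each pair shows a one-line jQuery call and its vanilla DOM equivalent.

// jQuery: $('#menu').addClass('open');
document.getElementById('menu').classList.add('open');

// jQuery: $('.panel').hide();
document.querySelectorAll('.panel').forEach(function (panel) {
  panel.style.display = 'none';
});

// jQuery: $('#save').on('click', onSave);
document.getElementById('save').addEventListener('click', onSave);

function onSave() { /* hypothetical handler */ }
```

If that is all a page needs, dropping the library saves roughly 30kb of gzipped transfer, plus the parse time on every load.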

V/r,

:slight_smile:

2 Likes

It’s not even that developers don’t know vanilla JS.

Think about which is easier: five minutes to load up the jQuery library, pass that weight on to the users, and write the appropriate function? Or spending three times as long on vanilla JS?

It’s so much easier to use jQuery/UI, etc. I load up jQuery UI, jQuery, the jQuery validation plugin and its additional-methods plugin, a plugin for sticky navigation, scrollit and scrollflow, along with some other custom jQuery for on-screen stuff, plus a decent number of images and one custom font. Even so, that page weight was around 450kb last time I checked.

This page wasn’t even minified at all.

There is no excuse for 2mb page weights. I literally threw everything possible at that page. It’s disgusting to think people have 5x that page weight (AND MORE! 2mb is just the average).
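For anyone who wants to check their own pages the same way, here’s a rough sketch to paste into the browser console. It sums transfer sizes via the Resource Timing API; note that cross-origin resources report 0 bytes unless the server sends a Timing-Allow-Origin header, so treat the result as a lower bound:

```js
// Approximate total bytes transferred by the current page's resources
// (excludes the HTML document itself, which is a navigation entry).
var total = performance.getEntriesByType('resource').reduce(function (sum, entry) {
  return sum + (entry.transferSize || 0); // 0 for opaque cross-origin entries
}, 0);
console.log('~' + Math.round(total / 1024) + ' kB transferred');
```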

1 Like

Or jQuery AND another library which does the same thing (MooTools + jQuery isn’t uncommon).

1 Like

Can you specifically point out sites that load jQuery and have only one or two lines of additional, custom JavaScript? I agree that aggregating files and reducing asset requests is an important part of optimization. However, that isn’t always possible with legacy systems without introducing a significant amount of risk and overhead. Not to mention that doing so further increases the overall complexity of managing the system. How many times have I heard web designers and front-end developers complain about the complexity of things like Grunt? Too many. Though I’m all for it.
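For what it’s worth, the barrier to entry is lower than the complaints suggest. Here’s a minimal sketch of a Gruntfile using the grunt-contrib-concat plugin (the paths are hypothetical) that merges every style sheet into one file:

```js
// Gruntfile.js: smallest useful config, one task, one output file.
module.exports = function (grunt) {
  grunt.initConfig({
    concat: {
      css: {
        src: ['css/**/*.css'], // hypothetical source layout
        dest: 'dist/site.css'
      }
    }
  });
  grunt.loadNpmTasks('grunt-contrib-concat');
  grunt.registerTask('default', ['concat']);
};
```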

He said “things”, not “lines”. They are different.

Alright, show me websites with one or two “things”.