Are we in a scripting-dependency backlash?

I can’t count how many times recently I’ve seen a comment, tweet, or article discussing the overuse of JavaScript-driven UIs and MV* frameworks for websites and apps that, in most cases, could have been built just fine without these tools.

There are tweets like this one from Jeremy Keith, or this one from Nicholas Zakas. There’s also Peter-Paul Koch’s thoughts on AngularJS.

I think we are in somewhat of a scripting-dependency backlash, and it’s probably a good thing. While ARIA roles and other accessibility features can help, depending on scripting just to display simple content takes things too far. Developers seem to be using flashy new tools for their own sake, and not because they are solving an actual problem.

Sadly, we still see even basic content disappearing when JavaScript is disabled. A great example of this is TIDAL, Jay-Z’s new music streaming service. I don’t expect the service itself to work without JavaScript, but, as I pointed out on Twitter, the TIDAL home page displays just a black screen if JavaScript is disabled. Awful!

Like many others, I still feel that the best way to approach app development is to ensure your entire application functions with JavaScript disabled. Any JavaScript and Ajax enhancements can then be layered on top of that by intercepting the default functionality.
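The “intercepting the default functionality” idea can be sketched roughly like this. This is a minimal, hypothetical example: `enhanceForm`, `fetchResults` and `showResults` are made-up names, and in a real page `fetchResults` would be an Ajax call. The point is only the shape of the technique: the form works on its own, and the script takes over only when it can.

```javascript
// Minimal progressive-enhancement sketch. Assumed markup: a search form
// that submits normally (full page load) when no JavaScript runs.
// With JavaScript, we intercept the submit and fetch results instead;
// on any failure we fall back to the default full-page submission.
function enhanceForm(form, fetchResults, showResults) {
  form.addEventListener('submit', function (event) {
    event.preventDefault(); // take over only while this script is running
    fetchResults(form.action)
      .then(showResults)
      .catch(function () {
        // Programmatic submit() skips submit listeners, so this falls
        // back to the plain, non-JS page load rather than looping.
        form.submit();
      });
  });
}
```

Nothing here is lost if the script never loads: the listener is simply absent and the browser does what it always did.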

Of course, some apps are pointless without JavaScript. Code playgrounds like JS Bin and CodePen come to mind. But in a lot of cases, progressive enhancement techniques can ensure that users are not left without even basic content or functionality.

Do you have any thoughts on this? Are progressive enhancement techniques too old-school? Or is the scripting-dependency backlash a good thing?

This editorial appears in this week’s issue of the SitePoint Newsletter.

3 Likes

Progressive enhancement is generally a waste of time. Users with JavaScript disabled account for an exceptionally low portion of users and it doesn’t make sense to spend time or money on it. Not to mention, your users should not be expected to have reduced functionality because less than 1 out of every 100 people make the conscious decision to disable JavaScript. Why stop there? Why not support for IE6/7/8? There are at least 4-5 times as many users on that.

Because supporting IE6 is a silly waste of time and resources. So is worrying about people with JavaScript disabled. Most of the reasons people disable JavaScript are outdated and wrong, anyway. The whole argument about “what if JavaScript doesn’t load” is equally silly. What if the HTML doesn’t load? What if the CSS doesn’t load? The user refreshes, that’s what.

In the case of TIDAL, one should not go to a live streaming music service and expect it to work without JavaScript enabled. However, rendering nothing but a grey screen on the front page of a website that is well funded and expected to reach the widest audience possible is probably also not a good idea. Even disregarding progressive enhancement entirely, it should at least have a description of what TIDAL is and why you need to have JS enabled. That would take what? About twice as much time as it took me to write this post? Whereas I think Meteor.com rendering a blank screen without JavaScript is perfectly acceptable, given that it’s specifically a site for a JavaScript framework geared towards developers.

5 Likes

There are a wide variety of reasons why JavaScript doesn’t load, and we’ve only covered the tiny tip of the iceberg here.

Yes, trouble fetching content from the site can be one reason, and the conscious decision to disable JavaScript is another, but by far the main causes are out of the person’s control.

Other reasons are (inspired by Everyone has JavaScript, right?):

  • The corporate firewall
  • ISPs and mobile services can interfere
  • Browser plugins can inject alterations that get in the way
  • The CDN can be unavailable
  • ES6 has compatibility issues
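One defensive response to that list is to test for the features your enhancements need before loading them, so any environment that fails (old browser, mangling proxy, partial load) keeps the working HTML. This is often called “cutting the mustard”, a technique popularised by the BBC News team. A rough sketch, with hypothetical names; in a browser you’d pass `document` or `window` as the environment:

```javascript
// Feature-detect before enhancing: if the environment lacks what the
// enhancement script needs, don't load it at all and leave the plain
// HTML version in place.
function cutsTheMustard(env) {
  return typeof env.querySelector === 'function' &&
         typeof env.addEventListener === 'function';
}

// Hypothetical usage in the browser:
//   if (cutsTheMustard(document)) { loadEnhancements(); }
```

The exact feature tests are up to you; the point is that the enhanced experience is opt-in for capable environments rather than a hard requirement.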

The whole school of progressive enhancement provides a whole new level of elegance on top of our craft. Anyone can put up a website, but providing the content first and ensuring that it’s capable of working in the great sea of forms of access and capabilities that are available - that’s a trick that we’re still learning to master.

Mostly it just takes time and effort to develop the muscles to achieve this, which is why people turn to frameworks so that they don’t have to do as much thinking. Coming off those frameworks helps us to exercise those muscles once again.

2 Likes

To me a lot depends on if they’re being used as a tool or a crutch.

If I had to churn out websites in rapid succession then I would consider using a framework as a tool to use as a time-saver.

Unfortunately, as Paul said, it seems many use them “to do what they don’t know how / want to do”.

Technology changes, and those that use frameworks as a tool should hopefully be able to switch to some other way (albeit likely not without some grief). The problem will be for those using them as a crutch, because they’ll need to either learn what they should already know, or find something else to lean on.

Not if you also turn off CSS :wink:

1 Like

Heh, that’s effectively progressive enhancement. :stuck_out_tongue:

Hm, if only it were that simple. Every second time I go to Facebook or Google services the JS doesn’t load, for some reason, and no amount of refreshing fixes it. I have no idea why I get such a bad connection, but still, most of the content I miss out on needn’t be unavailable just because of JS.

Here’s another brief discussion of the issue, to add to the above: http://alistapart.com/article/let-links-be-links#section1

Yup. I get hit by several of those several times a day, especially number one.

I have no problem with JavaScript adding value and “sparkle”, but having a site depend on it just to provide basic service, when that’s not necessary, drives me insane.

1 Like

I guess, in a way, though I’d consider it more of a fallback. Discourse is more of a fallback than progressive enhancement (or even graceful degradation). It is definitely an afterthought.

I can understand that, but I also don’t feel that it should affect the end result. By working from the base level up, you’re effectively limiting what your app can potentially be, whether you mean to or not, and whether you try to let it limit you or not. It will, no matter what. Especially if you’re considering an MV* framework, your app is probably meant to be highly functional. You’re punishing the many for the actions (or problems) of the few.

And if you’re going this far, why not support older IE’s?

Here is a snapshot from last month’s stats on the app I work on. It’s based on tens of thousands of unique users, all in corporate environments across different companies.

IE 11: 20%
IE 9: 20%
Chrome 41: 16%
IE8: 15%
IE 10: 10%

If you’re worried about corporate firewalls, why is there such a push to drop IE8 support? Because… you can’t move into the future by providing support for everything in the past and every possible curve-ball your users throw at you. The funniest thing is, a lot of the places I’ve seen that push hard for progressive enhancement, do not provide any support for < IE10.

CPU manufacturers keep pushing the envelope, trying to create product that outperforms competitor product, bringing us closer and closer to the physical technology limit (at least until new technology replaces current technology).

Restaurants keep formulating new recipes or combinations, trying to create dishes that will be more appealing than competitor dishes (like the ‘restaurant wars’ mentioned in “Demolition Man”).

Beverage manufacturers keep introducing new ideas into the liquid refreshment market, trying to make sodas/teas/beers that will be more appealing than competitor beverages (the ‘cola wars’).

What do all of these have in common with ‘scripting-dependency’? Consumer drive. Whatever is more appealing to the masses drives the direction of what gets created for mass consumption. Sadly, this means that a lot of the empty flash and pop that so many sites/apps are integrating is there because it is thought to be more appealing to the vast majority of the target demographic; any complaints are generally few and are drowned out by the shouts of demand from the majority.

Consumerism, capitalism, greed.

V/r,

:slight_smile:

2 Likes

While I agree that progressive enhancement is a waste of time, it’s still a good idea to practice NoScript solutions as they can be faster to implement and definitely much [much] more lightweight.

By using NoScript solutions, you can work around progressive enhancement, as there’s nothing more to progress.

I literally just made this now to prove it. It’s a fluid, NoScript UX with “keyboard shortcuts” that runs more smoothly, and more reliably, than some JavaScript alternatives.

http://codepen.io/Stemlet/pen/ZGGedm

1 Like

While I am in favor of graceful degradation, are there any stats that indicate how many users do not have javascript enabled?

Don’t browsers (desktop, tablet, and smartphone) enable javascript by default? The only exception I could think of would be a dumb phone; but I doubt that accounts for a large Internet-user base.

Plus, the common user probably doesn’t even know how to disable javascript (thus, keeping the default of enabled javascript).

The best I’ve seen is from 2010, which is before the Smartphone revolution.

https://developer.yahoo.com/blogs/ydn/many-users-javascript-disabled-14121.html

http://ydn.zenfs.com/blogs/1/js_disabled.jpg

After crunching the numbers, we found a consistent rate of JavaScript-disabled requests hovering around 1% of the actual visitor traffic, with the highest rate being roughly 2 percent in the United States and the lowest being roughly 0.25 percent in Brazil. All of the other countries tested showed numbers very close to 1.3 percent.

Then there is a UK specific post from 2013 that shows about the same:

https://gds.blog.gov.uk/2013/10/21/how-many-people-are-missing-out-on-javascript-enhancement/


The numbers are very strong against supporting JavaScript-less users.

2 Likes

I used to believe this was important, but after working for several companies, none of which cared, it is just easier not to care. Honestly, as mawburn said, it just isn’t worth the trouble when you compare the numbers. That being said, optimization can play a huge role in choosing a single-page application. I’d rather have a faster site for most of my user base, using something like AngularJS with a REST API, than something much slower that also works for the less than 1% of users with JavaScript disabled. The fact of the matter is that a single-page application will always provide a more fluid and fast experience than something that always has to keep reloading the page, which is one reason they are hot right now as web applications become more and more complex and interface with multiple technologies and vendors.

1 Like

I think that’s what it all boils down to. Web development is essentially a trade, like plumbing or landscaping. Those trades could all have standards movements equivalent to web standards, but in reality, it’s still going to be hard to find a tradesperson who is going to bother to strive for excellence when most customers are looking to pay as little as possible. As a customer, even when you are prepared to pay more for a good job, it’s hard to find a tradesperson who won’t just do the bare minimum and rip you off.

As mentioned above, that’s not really the issue—though it’s really hard to get this message across to those who don’t want to hear it. I got into web design during the renaissance of web standards, and it’s really sad to see it all go down the gurgler so quickly. :frowning:

Progressive enhancement is the best way to code for the web. It always has been. If you’re using mobile first, that’s a PE technique.

That’s not to say every web application needs PE. Your game or video editing app is never going to work without canvas, JavaScript etc. But those cases are relatively rare.

The comments above reject PE because a tiny proportion of users disable JS. That’s not the issue. The problem is the plethora of devices we have to support from feature phones to screen readers to tablets to PCs. It’s impossible to test everything but, with PE, you don’t need to. You’re writing defensive code; if any aspect breaks or fails, the user will still see something (presuming HTML is returned). PE is no more effort than whatever you’re doing now. It’s a way of thinking - not a technology. There’s absolutely no reason to reject it.

Sites I built in 1996 still work today. Of course they’re awful but they’re usable. Will you be able to say the same thing about an Angular-powered site in twenty years? Or even five?

So yes - we are seeing a scripting-dependency backlash. And it’s about time.

5 Likes

I’ve got to agree with ceeb. Progressive enhancement is still the way forward after all these years.

Sadly, experience of the real world is that projects you build aiming to progressively enhance with javascript at a later date are rarely revisited and enhanced. A business looks, sees the project complete, and marks as job done and you’re on to the next project that will try to create a profit or, for internal developers, cut a cost.

The isomorphic javascript frameworks are trying to improve on this so that users can get that enhanced experience with the first release, rather than the clunky sites of old that do work but aren’t engaging.

At the end of the day, your users will be expecting a fast and responsive (in the old school meaning of the word) experience. Javascript delivers that, frameworks provide a way to marshal the code base into something reusable and users expect a certain degree of interactivity. It’s trendy to knock javascript but it’s not going to go away and is only going to improve.

2 Likes

Recently, I asked a question on Quora with a similar context. Is it better to generate HTML on the server using a scripting language such as PHP and then send the complete page to the browser, or is it better to use PHP to access a database holding, in JSON format, the data needed to generate a page? In the latter case, the JSON is sent to the browser, where a rendering engine (in JavaScript), downloaded when the session/connection was created, turns it into a web page.

I have developed my own Framework (Cliqon) which does the latter. My eldest son and grandson, both programmers, were trying to tell me that the former was the best approach.

I knew them to be wrong but wanted the opinions of other Programmers on Quora about the matter. Every single respondent supported my way of doing things.

To make matters worse for those involved in PHP etc., the arrival of PHP7 will not go well and will herald the gradual demise of PHP, to be replaced by Node or its companions. Thus, as far as I am concerned, developing websites in properly written JavaScript at both levels (client and server) is the way forward. My Cliqon supports both PHP and JavaScript code as backends (it only requires a change to the URLs in the Ajax calls).

So from my point of view my response to your original question is straightforward. I consider your question to be ill judged and if required to give an answer, the answer to it would be absolutely NO.

The point is: you can do both.

Let’s consider a set of search results. The first page load returns HTML. It’s quick: there’s nothing for JS to do. When the user clicks “Next”, the second page is loaded:

  • If JS isn’t available, that’s a second page of HTML.
  • If JS is available, you make an Ajax call, fetch JSON results and update the DOM.

The system remains usable on all devices regardless. Is it any extra effort? No - not if you do it right from the start. That’s progressive enhancement.
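One way to keep those two paths from doubling your work is to share a single rendering function between them. This is a sketch with hypothetical names, not anything from the posts above: the server would call it when building the full HTML page, and the client would call it with the JSON results when updating the DOM, so the markup logic exists exactly once.

```javascript
// Shared rendering: one function turns search results into markup.
// Server side: embed its output in the full page response.
// Client side: call it after the Ajax request and swap it into the DOM.
// (Real code would also HTML-escape r.title and r.url.)
function renderResults(results) {
  return '<ul>' +
    results.map(function (r) {
      return '<li><a href="' + r.url + '">' + r.title + '</a></li>';
    }).join('') +
    '</ul>';
}
```

Whether this is practical depends on your stack; with JavaScript on both ends it is straightforward, while with a PHP backend you would need either duplicated templates or a shared templating language.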

That said, if you’re simply serving website content, why is it necessary to have a dependency on JavaScript? It’s additional complexity with no benefit to you or the user.

2 Likes

I’m not a professional web developer so my view is from the consumer standpoint. It just seems common sense that you want maximum access to your site especially if it is a business. Many sites now cause my iPad to crash or become so sluggish as to be unusable. It is very frustrating and, as I do understand the web a bit, I can’t understand why. It worked perfectly only 6 months ago. I understand that it won’t be able to run the latest apps/games but I know that there is no real reason why I shouldn’t be able to buy something over the web.

So the problem from my perspective is not javascript per se, it is the constant push to use the very latest version. I can’t help comparing this to other things in life. Take a fuel pump at a petrol station. Imagine going to refill a car and finding that the upgraded petrol pump is no longer compatible with your filler. Significantly this would be frustrating for you but a loss of business for the petrol station.

1 Like

That they’re awful is not being contested. :stuck_out_tongue:

Where I’d go from here though is to argue that due to the progressive enhancement manner in which they were built, that you can today fairly easily make further improvements to them to meet today’s demands.

Is there anything that would make you think they wouldn’t?

All the tacky JS animations and Flash I made in Swish still work fine 15 years later. Why would things built on the current base of standards (JS) not?

Will I like what I’ve done in 5 to 10? Probably not. But they will still work.

Yes, there is at least double the effort in your description. I now have to make sure the server and the client can render the information in exactly the same way. These are two entirely different processes.

Not to mention, what if I have other pieces being updated by JS? Usually a page that updates by Ajax is being updated in multiple ways. So now, not only do I have to make your new Next request work identically on the client and when rendered by the server, I have to make sure all these other little pieces do the same.

Now you’re talking about even more things I have to double up, which has now turned this Next page into an engineering headache.

I mean, you can nonchalantly say this isn’t more work. But I assure you, it is absolutely not a trivial amount of extra work.

That said, if you’re simply serving website content, why is it necessary to have a dependency on JavaScript? It’s additional complexity with no benefit to you or the user.

I agree with this. If the site is content based, where UI/UX gets pushed to the side, then it should stay content based.

All I want to do on a blog post is scroll up and down.