Page movement and usability

Hello everyone.

I am working on a web page. I have completed the HTML part of it for now and validated it to make sure there were no errors. Now I want to do a one-page portfolio, and I’ve noticed that some of the nicer-looking ones seem to use JavaScript, jQuery or other forms of scripting to achieve a kind of movement while the viewer looks at the page (e.g. Steven Little Design : Web Design, Print Design, Identity Design… really anything design). I’ve noticed that this page moves horizontally as you click each menu option.

On one hand, it looks nice. On the other, it seems a little cumbersome. However, I want my portfolio to stand out, and pages with that kind of animation seem to be the standard. At the same time I still want a page that can be viewed even if you don’t have JavaScript on, or if you hate that kind of stuff. So what would you guys recommend?

Build the page first with HTML and CSS so that it looks reasonable and is usable. Then add JavaScript to incorporate whatever behaviours you want.

Those without JavaScript will then see and use the page the way you saw it before you added the JavaScript. All scripting has to be done in JavaScript because that’s the only scripting language all browsers support (jQuery etc. are written in JavaScript and just provide you with some predefined code you can run in your own script).
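Very roughly, that can be as simple as this – a sketch only, assuming jQuery is loaded and using made-up section ids. The plain anchors are the page; the script is just the scrolling garnish, and if it never runs the links still jump to each section:

    <!-- plain HTML first: ordinary in-page links that work everywhere -->
    <nav>
      <ul>
        <li><a href="#about">About</a></li>
        <li><a href="#portfolio">Portfolio</a></li>
        <li><a href="#contact">Contact</a></li>
      </ul>
    </nav>

    <!-- then the enhancement: reuse the same hrefs, only if JS is running -->
    <script>
    $('nav a').on('click', function (e) {
      var target = $(this.hash);        // e.g. $('#about') from href="#about"
      if (!target.length) { return; }   // unknown target: let the browser handle it
      e.preventDefault();
      $('html, body').animate({ scrollTop: target.offset().top }, 600);
    });
    </script>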

Yes, check out that page you linked to with JavaScript off. Oh my. :nono:

Given the absurdly undersized fixed-metric fonts, content that doesn’t fit the majority of people’s screens (particularly on height), color contrasts below accessibility norms (that black on dark blue being particularly helpful – though the blue on blue in an illegible script is really cute too) and fat bloated jQuery asshattery… I’m not sure I’d be looking at such a page as an example of ANYTHING to mimic. In fact, it’s a great example of how NOT to build a website – being this century’s equivalent of animated GIF backgrounds and auto-playing MIDI music.

Cumbersome doesn’t even begin to cover the effectively useless accessibility train wreck that steaming pile of manure amounts to.

If your inner art child is longing for that type of page, do yourself a favor and kick them square in the junk. But then, 99% of the animated crap you find on websites is… well, as we used to say when people pulled those types of stunts using Flash, “There’s a reason it’s called flash, and not substance”.

This has become really tiresome. Sorry, ralph.m, but if a user chooses to cripple his own browser, he has no one to blame for the outcome but himself. JavaScript isn’t some exotic add-on or plugin – it’s a fundamental part of every browser. If you turn it off, some things aren’t going to work. That’s not the site developer’s problem.

No, you blame the {nasty expletive omitted} developer who didn’t use JavaScript properly on simple elements – building without progressive enhancement and thereby failing to have graceful degradation – which means MISSING THE ENTIRE BLASTED POINT OF USING HTML IN THE FIRST PLACE!!!

Just like the “designers” who use fixed height images behind flow content, illegible color contrasts and absurdly undersized fixed metric fonts!

Some people can’t stand all that JS stuff and just want to get to the content. Too many designers get mesmerized by “bling” and forget about the primacy of content. To make it impossible to get the content without JS on is just bad design. It’s not hard at all to make content accessible with JS off.

In fact it should be the first step before scripting is written. How or even why would you write a script without some idea of the content or the semantic markup for that content? Unless you’ve got your head wedged up 1997’s backside at the pinnacle of HTML 3.2, you wouldn’t.

Which is where you see so many failings in design and development – people writing scripts or even HTML and shoe-horning content into it instead of writing proper semantic markup and progressively adding to it with all the neat stuff.

Which is when I don’t have a problem with all the ‘gee ain’t it neat’ stuff – when it’s written to accessibility norms using the common sense and proper PURPOSE of HTML… Progressively enhance the content instead of shoehorning the content into “gee ain’t it neat” bull that pisses all over accessibility, cross device compatibility, etc, etc…

But of course we’re still fighting the pollution and damage of HTML 3 at what’s rapidly closing on a decade and a half later.

Sure, but there are some things like sliding galleries that usually come ready-made, so I was talking about tweaking them to work with JS off. I know that’s the wrong way to go about it, but it’s how most people will come at it (and frankly, who’d bother doing a sliding gallery with just HTML and CSS? They’re a bit stupid anyway. But of course, they should still “work” as HTML and CSS if you have them.)

That isn’t always a fail on the developer’s part, but one of the consequences of giving a customer too much control. Tools like CKEditor, TinyMCE and others are invaluable for allowing just about anyone to “design” the content area, but they always output god-awful HTML. It’s just the nature of the beast: it’s pretty much either plain text only, plain text with some special tags, or full control – not much wiggle room in between. Depending on the customer’s business goals one can be better than the other, but they all come with sacrifices. The former sacrifices end-user flexibility and the latter sacrifices from the technical end. It’s just the way it is sometimes.

Yes, good point. The days of static sites are fading fast, and it’s a real dilemma when setting up a CMS to work out how to minimize the damage clients will unwittingly do when they start to enter content. You can restrict what they do, but it’s a losing battle.

True! For a good example of how NOT to apply JavaScript, please visit the SP Blogs and try to comment there using a user-agent that doesn’t support JS or has it disabled.

You cannot, full stop! :wink:

It’s called ‘discrimination’, caused by the web author erecting web accessibility barriers. For example: Thinking Web: Voices of the Community » SitePoint – there you can clearly notice the difference and see a whole new world of comments on that page (in a browser minus client-side scripts). Plus you cannot even attempt to comment there.

The ‘primary objectives’ of those SP Blogs are to allow you to both read and comment upon them. The second objective you cannot achieve unless the user-agent is running JS, thus it constitutes “failure” of core functionality.

We aren’t talking bells-and-whistles or preview buttons or editing boxes, but a simple REPLY mechanism – which could easily be created, with reasonable adjustment, by the commissioned web author without the need for JS.
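Something as plain as this would do it (a rough sketch only – the action URL and field names are invented for illustration):

    <form action="/blog/post-123/comments" method="post">
      <label for="comment">Your reply</label>
      <textarea id="comment" name="comment" rows="8" cols="60"></textarea>
      <input type="submit" value="Post reply">
    </form>

No script required at all; previews, inline posting and the rest can then be layered on top for those who have it.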

The power of the Web is in its universality. Access by everyone regardless of disability is an essential aspect.
JavaScript should only be an added extra or a progressive enhancement. How many people have their browser set to block auto popup windows or adverts while still having JS enabled? I suspect many. If you take the approach that JS is “fundamental”, then the group (with JS enabled) that has popups or adverts blocked should all be rounded up and shot along with the users who have JS disabled.

Actually, CKEditor’s output isn’t that bad, especially if, when setting it up, you force it to show the block-level containers, give the user a separate field for the topmost heading of the section they are editing, and neuter the “format” box to only show headings lower than that. I usually further mod it so the bold button really gives B and the italic button really gives I, add my own STRONG and EM buttons (since the four all have different purposes), yank or restrict color setting, change the font dropdown to list percentages or remove it entirely, and change the left/center/right buttons to instead apply classes like ‘.trailing’ and ‘.plate’ – and make other modifications so it isn’t outputting quite such bad code. In fact, CKEditor is one of the few WYSIWYGs you can put into a CMS that’s worth a damn… at least once you use its API to “lock it down” – especially since it outputs proper opening and closing tags on all the block-level containers, by default building with P properly when you hit enter (and making BR when you shift-enter). You can also ‘coach’ the users by giving the editor the same CSS as the page would receive.

… and even if the WYSIWYG inside the CMS is outputting crap content, that leaves no excuse for the template around it, since there’s typically more to content than what the user is typing into a CMS textarea… like the heading with its link, the date, who posted it, the footer with its social links, number of comments, etc, etc…

Though yeah, if you’re talking TinyMCE, the code that it vomits up is terrifyingly bad.
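For what it’s worth, the sort of lock-down I’m describing looks roughly like this in CKEditor 4’s config – treat the option names as from memory rather than gospel, and ‘article-body’ is just a made-up field id:

    CKEDITOR.replace('article-body', {
      format_tags      : 'p;h3;h4;pre',            // only headings below the page's own
      removeButtons    : 'Font,FontSize,TextColor,BGColor', // yank the font/colour toys
      coreStyles_bold  : { element : 'b' },        // bold button really gives B...
      coreStyles_italic: { element : 'i' },        // ...and italic really gives I
      enterMode        : CKEDITOR.ENTER_P,         // enter makes P
      shiftEnterMode   : CKEDITOR.ENTER_BR,        // shift-enter makes BR
      contentsCss      : '/css/screen.css'         // 'coach' users with the page's own CSS
    });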

You boys are all setting yourselves up to be completely left behind, regarded as hopeless fossils. Enjoy.

[ot]

You boys are all setting yourselves up to be completely left behind, regarded as hopeless fossils. Enjoy.

That is so cute. Which do you hate more, people who don’t use Javascript, or people who still use IE6? Or is it a tie?

Being a member of a hated and at times persecuted minority, I call for someone to step forth as our Professor Xavier! He can start a school for scriptless mutants for those who want to uphold Tim Berners-Lee’s hippy ideal of freedom of information (not just free as in beer, but free as in speech)!

Next argument, net neutrality: those who agree to pay more should get better access to information on teh interwebs. After all, if they choose not to pay, they can just be happy with closed doors and broken interfaces. At the very least, they shouldn’t complain: it’s the future.

Discrimination is the future. Just wait till we can’t do online banking without WebGL (I don’t even own a single machine that can manage that). Participation in the world wide web and the so-called freedom of information will be limited by your hardware budget and the size of your tubes. AS IT SHOULD BE, MERE UNWASHED MASSES! YOU ARE BUT WEB PEONS. Egalitarianism is for hippies.

Simon Willison:

The Web for me is still URLs and HTML. I don’t want a Web which can only be understood by running a JavaScript interpreter against it.

(notice that link does not work without JavaScript. Because anchors with hrefs, the working protocol of the web, are just NOT HIP ENOUGH)

One can argue CSS is fundamental to a browser, but a website that relies on it is, no other way to put it, badly written (notice this is different from the medium determining the hardware/software… you can’t run Xbox games on a Commodore and you can’t play videos in Lynx). One that relies on images is badly written. One that relies on colour is badly written. One that can’t talk to the server without client-side scripting is badly written. But we all want the web to be open to all authors, right? We want Joe to be able to whip up his motorcycle blog without any (web) knowledge whatsoever. This means 99% of the web will ALWAYS be badly written. But so long as Joe doesn’t claim he’s a “web developer”, it’s between him and his visitors.

Anyone calling themselves a “web developer” who cannot build a page without needing crutches, libraries, and non-universal technology doing all the work for him or her brings the rest of us who know what we’re doing down to the lowest level. This is why I don’t complain about “frivolous” lawsuits when brought against companies who hired “web developers” who can’t do their jobs correctly and build broken pages.

But the web will continue to become fragmented, and there will be separate drinking fountains. We’ll either learn to live with it, or we will have a Civil Rights movement.

In the meantime, those of us who know what we’re doing can charge more for sites that won’t become the target of a lawsuit from someone with a disability or lack of some hardware or software. See, we can always layer a crapload of scripts (just like we do CSS and images blah blah) on top of working stuff, so users with all the bells and whistles don’t know the difference. Oh, so terribly terribly sneaky. Why, it’s downright deliciously devious!

Someone give me a cookie.[/ot]

[ot]We might be “fossils” but you should beware of carelessly breaking valuable “historic” statues…

I am a visitor to a site, so it is (or should be) my choice whether I run JS, display images or allow CSS – not the web author’s to force on me.

I might not fancy the idea of an ‘epileptic seizure’ being brought on because some author thought it was fun to create a strobe effect with JS or some other ‘cool effect’. Surely I am evil if I disable JS just because the author wrote it – and the same with malicious scripting… I should allow that too? Does the author pay for my bandwidth as well?

If JS is used “correctly” the site will still function (reasonable adjustment) – that is what we are talking about, not blanket cover.

We differentiate between the two; obviously you cannot make a working interactive graphics application with markup alone, and we accept that. So don’t automatically assume we are being cantankerous. :-)[/ot]

[ot]There’s just something about it that’s almost Amish. There are lines being drawn in the sand - “We will use this technology, but not this other thing.” There are always long, involved explanations about why the line is drawn where it is, and they seem to make sense at first blush. But at the end of the day, the result comes across as being arbitrary and capricious. They’ll use a cell phone outside, but not in the house. Go figure.

So, you expect HTML, not just TXT files. You’re a bit ambivalent about CSS - you want to be able to turn it off. You object even more strenuously to JS. Ditto for Flash. And on it goes. Yet the site should remain accessible to you.

As a developer, do I really need to develop a page that will provide you with a reasonably complete experience even if you turn off everything? Should I use only bright colors in case you set the brightness on your monitor down to almost zero? Should I never use an accented character in case you’re restricted to ASCII? Do I have to provide a graceful degradation path if you’re using an LA120 with a 1200-baud modem to access the site?

When it comes to JS, I don’t think I’ve ever turned it off, and I’ve never hit any sort of exploit, or any page that produces some sort of visual horror. Perhaps I should start visiting porn sites? :wink:

By the way, I’m not a youngster, overly enamored of all that’s new and shiny. I go back to mainframes, C before there was a ++, RAM that cost several thousand dollars per MB, and so on. One of the things I learned along the way was never to let myself get “stuck” at some stage of the evolution of the IT world (“We don’t need any of this new-fangled stuff. What we had five years ago was good enough!”). That’s a trap to be avoided.[/ot]

It’s not that content is inaccessible; it’s rather that people neglect to make use of tools to access the content. It’s like complaining you can’t reach a shelf when a step stool is right in front of you. In most cases it’s the responsibility of another company’s product to bridge the gap between the majority of the population and the outliers. People shouldn’t point the fault at the people building web presences; the trouble is there’s no money in building tools that let those with disabilities access the web without major sacrifices. That is just how the world works. Now I am in no way saying all websites should be JavaScript-only and use images for text, not at all. Merely that when you make the choice to turn things off, or sadly live with a disability, the experience and even the content may not be as accessible as it is for someone who is the norm. That doesn’t mean we shouldn’t try, but everything comes down to money.

[ot]

It does depend on what the site is trying to do, but for the most part, on the majority of websites – YES! Why? BECAUSE THAT’S THE POINT OF HTML IN THE FIRST PLACE!!! Device independence (sometimes called device neutrality) is the entire reason to use HTML to deliver content. We got away from that during the browser wars, and many developers never returned from that dark evil place, but that’s what strict doctypes, separation of presentation from content, and progressive enhancement are all about.

It’s why the best (IMHO) approach – even if you’re handed a goofy picture by a PSD jockey – remains coding semantic markup first (pretending the pretty picture/layout doesn’t even EXIST), creating the layout with CSS, going into the paint program to make the images (or slicing them out of the original PSD if working from that) and hanging them on the layout using CSS, and finally adding scripted behaviours to improve the user experience, avoiding ones that just suck down bandwidth like candy and provide no REAL enhancement to functionality.

You work that way, and it’s no harder to have things work as the “gee ain’t it neat” technologies vanish. Progressive enhancement means graceful degradation. Failing to provide that, particularly if the content is just fixed text and objects, misses the point of HTML, the point of the Internet, and amounts more to ignorance and ineptitude than skill.

There certainly are exceptions, where actual FUNCTIONALITY and even content is provided by the scripts in a manner that cannot be done any other way – Google Maps is a great example of this… But when all you have is a page of 3k of static content with 5 content images, there’s no excuse for your goofy half-assed animated crap to break the content when it fails. One of the basic rules of JavaScript for normal pages: enhance functionality, don’t replace it.
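A trivial sketch of that rule, again assuming jQuery and with the selector and markup made up: the form below posts to the server all by itself, and the script is nothing but icing for those who get it:

    // only reached if JS is actually running; otherwise the form submits normally
    $('#contact-form').on('submit', function (e) {
      e.preventDefault();
      $.post($(this).attr('action'), $(this).serialize(), function () {
        $('#contact-form').replaceWith('<p>Thanks – message sent.</p>');
      });
    });

Turn scripting off and nothing is lost – the browser just submits the form the old-fashioned way.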

… and it’s not that hard to do, and usually results in less code and better use of caching models, especially if you realize what idiocy most “frameworks” on the Internet are, be they JavaScript OR CSS.

When it comes to JS, I don’t think I’ve ever turned it off, and I’ve never hit any sort of exploit, or any page that produces some sort of visual horror. Perhaps I should start visiting porn sites? :wink:

Whereas I repeatedly come across websites with broken/buggy scripts, scripts that are just outright annoying with animated crap that breaks even simple things like forward/back, opening in new tabs, or directly linking to a section of content… and we’re not talking porn sites here… Lands sake, I’m becoming increasingly concerned over how it seems like the porn industry is more reputable than most of the fly-by-night sleazeball scam artists out there developing sites for people.

There will always be less capable devices – every blasted year for the past DECADE we’ve heard “oh, nobody uses that resolution anymore” or “nobody turns that off”, then some new device – like a netbook, or a smartphone, or a pad – comes along and is a pimp-slap to everyone who said that. Same goes for the “oh, bandwidth keeps increasing”, which gets a backhanded slap from people browsing on bandwidth-throttled or metered 3G connections, pay-as-you-go metered caps on home connections (which I’m sure all our Canadian, Kiwi and Australian friends would be MORE than happy to explain to you) or entire counties in the US where 33.6 dialup is STILL a good day… like Coos County NH, most of western ME, the Dakotas… Those are the people who turn things off, resort to things like Opera Turbo (one of the big selling points of Opera Mini) and in general aren’t even going to bother repeat-visiting any website that sucks down half a megabyte in scripting ALONE to deliver 2k of plaintext and five static content images. Say hello to Mr. Bounce Rate!

If there’s anything I’ve learned over time, it’s that the scale expands in BOTH directions, not just up. Just because you’re sitting at home with a 32″ 2560x1440 Apple Cinema Display doesn’t mean the guy on the HTC One is going to have the same experience on a website. Again, see why “what you see” should never be expected to be “what the user gets” when it comes to the Internet (and why WYSIWYG is a steaming pile of /FAIL/ alongside px for content fonts and fixed-width layouts).

… and it really boils down to practices that should be followed from the START of working on a site and require no more effort – and as such, failing to practice the entire REASON that HTML and CSS exist boils down to what Eisenhower once called “an apathy that had its roots in comfort, blindness and wishful thinking.”[/ot]

It’s like Jason said: you have to consider that many users may be operating in contexts very different from your own when using a website. :slight_smile: