Unobtrusive JavaScript: Now it has come to this!

Continuing the discussion from When back button clicked redirect to URL?:

Hi all,

this is getting mentioned now and then, so I thought I’d start a new topic. :-) Until a couple of years ago I used to browse the web with JS disabled by default myself, but since JS increasingly seems to be becoming a hard requirement for a half-way decent UX, I eventually ditched old NoScript. There are plug-ins like Disconnect which selectively block unwanted scripts… if not as reliably, naturally.

So anyway, in times of ubiquitous SPAs which plainly don’t work w/o JS (Facebook, Instagram, SoundCloud… just to name a few), how much do you as devs still care about unobtrusive JS? Do you consider no-JS fallbacks dispensable, a vital requirement, or just good practice?

Personally, I do think that you can confidently expect client-side JS to work. Application logic is inexorably moving towards the browser. – Yes, you can write your app “server-side-first” by serving static HTML by default… but no, deadlines are tight and I’d rather groove the Angular part than make sure the page works properly w/o it. (Edit: Please keep the sh!t5t0rm to a minimum… [cough])


Discuss. :-)

Cheers
sebastian

4 Likes

It seems a new generation of front-end developers has hit the ground running largely unaware of the standards movement of the 2000s, which is a shame. It’s also a shame that discussions around progressive enhancement too often focus on whether or not the end user has JS “turned off”. When that’s the focus, I can understand why many developers think “to hell with people who have JS turned off”.

But progressive enhancement is not about people having JS turned off. I have JS enabled in my browser—Chrome, on a pretty new Mac. However, I often find that JS doesn’t load properly/fully when I visit sites. Maybe it’s my internet connection—I dunno. But the fact is that JS can—and often does—fail. You can’t be sure it will reach the end user intact. So the question then is, what are you—the developer—going to do about it? You can either set up fallbacks in case of malfunctions, so that end users will still be able to access content, or you can just ignore the problem as being too hard, not worth your while or not within the budget.
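
To illustrate, here’s a rough sketch of the kind of fallback I mean, using a hypothetical “load more” link (the URL, the IDs and the fragment-returning endpoint are all made up for illustration). The plain link performs a normal page load by itself; the script, if it arrives intact, upgrades it to load in place.

```html
<!-- Plain HTML first: clicking the link loads page 2 the ordinary way. -->
<div id="articles">…</div>
<a href="/articles?page=2" id="load-more">More articles</a>

<script>
  // If this script arrives and runs, it upgrades the link to fetch the
  // next page in place. If it never loads, the link above still works.
  document.getElementById('load-more').addEventListener('click', function (e) {
    e.preventDefault();
    var xhr = new XMLHttpRequest();
    xhr.open('GET', this.href);
    xhr.onload = function () {
      // Assumes the (hypothetical) endpoint returns an HTML fragment.
      document.getElementById('articles')
        .insertAdjacentHTML('beforeend', xhr.responseText);
    };
    xhr.send();
  });
</script>
```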

When I got into web design, one of the most inspiring aspects of it was the concept of progressive enhancement and the accessible web. It’s sad to see all that going down the drain now. I understand that today’s app-focused web is heavily JS-dependent, and I can understand why there’s a reluctance to worry about fallbacks. What I’m less tolerant of is sites whose content relies on JS completely unnecessarily. That’s going too far, imho, and simply bad web design. I often end up on sites that are just a blank white page, and I not only give up on the sites but also spell out a little curse on the developer who served up this hopeless web experience.

Please don’t anyone turn this into a thread about having JS turned off. Doing so advertises that you don’t understand the real issue here.

5 Likes

My definition of progressive enhancement has changed: to me it’s more about a sane layering of technology, using the simplest tech that is sufficient. I’m only opposed to using JS when there are better, simpler options available.

There are a lot of experiences that only make sense with JS, so that’s the right tech to use. E.g. if the best UI involves heavy use of drag and drop, there’s no reasonable fallback, but that doesn’t mean you can’t present a static view that people can interact with (see the sketch below); it just may not be as fully featured.
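
For example, a rough sketch of what I mean (the names are illustrative): without JS this is an ordinary, readable list; with JS, the same markup gains drag-and-drop reordering.

```html
<!-- Without JS this is just an ordinary list: a static view people
     can still read, copy from and interact with. -->
<ul id="playlist">
  <li draggable="true">Track one</li>
  <li draggable="true">Track two</li>
  <li draggable="true">Track three</li>
</ul>

<script>
  // With JS, the list gains drag-and-drop reordering.
  var list = document.getElementById('playlist');
  var dragged = null;

  list.addEventListener('dragstart', function (e) {
    dragged = e.target;
  });
  list.addEventListener('dragover', function (e) {
    e.preventDefault(); // allow dropping
  });
  list.addEventListener('drop', function (e) {
    e.preventDefault();
    var target = e.target.closest('li');
    if (target && dragged && target !== dragged) {
      list.insertBefore(dragged, target);
    }
  });
</script>
```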

The time I find JS being used incorrectly is when there are 10 lines of JS to do something that HTML or CSS already does much better; “JS for everything!” often leads to unnecessary complexity. PE can still add value to the design of JS-rich components if it means using HTML and CSS correctly. It still has value there for finding the simplest solution.
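
A contrived but typical example of what I mean. The first version is the “10 lines of JS” approach; the second is what HTML already does natively, with no script at all:

```html
<!-- ~10 lines of script to show and hide a panel… -->
<button id="toggle">Show details</button>
<div id="panel" hidden>Extra information…</div>
<script>
  document.getElementById('toggle').addEventListener('click', function () {
    var panel = document.getElementById('panel');
    panel.hidden = !panel.hidden;
  });
</script>

<!-- …versus the built-in details element, no script required: -->
<details>
  <summary>Show details</summary>
  Extra information…
</details>
```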

I agree there’s not much value in supporting the case where people choose to disable JS any more, but the cases where the JS is still downloading, or where there’s been an error, are still reasonable to keep in mind.

4 Likes

While you can have JavaScript on or off, there are also a huge number of states in between.

Any JavaScript (unless it is written for Netscape 2) will use commands that not all browsers will necessarily support. A lot of people will have JavaScript enabled but not at the level needed to run specific scripts.

Depending on what code you use in the script, there will always be visitors whose browser doesn’t support it.
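
A rough sketch of the defensive approach this implies (the class name is illustrative): test for the features before using them, so visitors whose browsers lack them simply get the unenhanced page.

```js
// Only attach behaviour the visitor's browser can actually run.
if (document.querySelector && window.addEventListener) {
  window.addEventListener('load', function () {
    var widget = document.querySelector('.fancy-widget'); // illustrative
    if (widget) {
      // Enhanced behaviour goes here. Browsers that failed the feature
      // tests above never reach this code and keep the plain, working page.
    }
  });
}
```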

There are also lots of sites with broken scripts, which encourages people to turn JavaScript off, and not everyone knows how to turn it off just for specific sites.

6 Likes

I find that with evergreen browsers this is becoming less of an issue. Of course there are some cases, but generally it’s much easier today to support a baseline of browsers with a modern JavaScript runtime. Chrome helped massively here.

1 Like

It’s interesting you think using Angular with a REST service is a simpler approach. In my experience it is more complex, because it requires two separate websites that possibly run on two different stacks of technology. I don’t think it’s simple at all. It’s much more complex, especially for one person.

I was building a personal project with Angular and Laravel and just kept getting bogged down by the complexity of managing two stacks of technology. I started rebuilding the project in Laravel alone, and my development is going much faster and is more focused on getting things done than it was when I needed to know both Laravel and Angular.

I think people just claim it is simple so others buy into it, but it really is not for realistic projects. My experience has been that it’s an unproductive pain in the ass.

It’s not without its merits, but it isn’t very simple either, especially when dependencies start failing and whatnot on that damn Node stack.

2 Likes

It’s true, server-side stacks are much simpler and faster to build with than the very best client-side setups; they are just less powerful, which is why many people start with the client. You may not in fact need a highly responsive UI in every part of the app, though.

Many devs starting today will only ever deal with JS-powered UIs connected to APIs, so they’ll use what is available and commonplace, not what is simplest. There’s also the very real case of offline-capable mobile and progressive web apps that are inherently reliant on JS; server-side powered apps are not well suited there.

2 Likes

I didn’t say that! ^^ It’s just as @markbrown4 says – when developing a site, you might have to rely on JS to meet certain requirements. So you’ll build a RESTful API with, say, Symfony or Laravel and do much if not all of the view logic on the client side for features such as lazy loading, building your site around a JS framework such as Angular. At this point you might quickly end up basically having to develop two sites if you want it to work w/o JS as well. It seems that many don’t bother (and I’m getting less and less sensitive to this myself).

This is indeed it. And I think that many modern approaches nurture that attitude. As for the possibility of JS failing, you’ll certainly have to account for that if you’re building, say, an online banking site. (In fact, I’d probably not use JS at all for processing the data in such a case.) But other than that, the web is undeniably becoming increasingly JS-dependent. And well, of course there are economic reasons for that.

True! With Disconnect, many comment sections don’t work by default, for example. Saves me the trouble of having to block the comments manually. :-D No, seriously, good point!

Thanks for all your replies everyone! :-)

I couldn’t agree with you more, and I find it very sad.

Again, those are my sentiments.

The first thread I read this morning was from a new member struggling with a site-builder site. I went to have a look at that site, to see if I could find a “Help” or FAQ section - and was met with a blank page. Really? A site-builder site which can’t display anything without JS?

To be clear, I have no problem with sites which use JavaScript for progressive enhancement, or for widgets and other things which clearly require scripting. I have no problem with messages like this:
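
Something along these lines, say (a hypothetical sketch, with the chart and the download link made up purely for illustration):

```html
<noscript>
  <p>This page draws an interactive chart, which requires JavaScript.
     If you can't enable it, you can
     <a href="/report.csv">download the underlying data</a> instead.</p>
</noscript>
```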

There is a reason why JS is required, and it’s explained. What I object to is the growing number of sites which tell me “You must enable JS to use this site”, or present a blank page, or a page with a distorted and unusable layout. That is just poor practice, IMHO.

To add to @ralphm’s scenario of JS not loading correctly, there are also people using work or other shared computers who have no control over whether or not JS is enabled. Telling them to enable it to use the site is futile; they can’t. And then there are those who have chosen to disable JS. Presumably they have all done so for a reason which seems sound and valid to them, whatever developers may think of it. So telling these people that they “must” enable JS, without at least letting them view some content and judge whether or not they want to do so, is probably also futile.

Imagine you enter a store, and find yourself facing a blank wall, with a blank door marked “you must remove your shoes and socks to enter”. Would you do it? Maybe you would, if they had something you really wanted to purchase, but you can’t see a thing, so how do you know? If they at least had a window so that you could see what’s available, you could judge whether or not you wanted to be bothered going barefoot for a closer look, but as it is… If you wouldn’t build your bricks-and-mortar site like that, why would you do it with your Web site?

Not everybody has broadband; not everybody has a good connection with unlimited bandwidth, or wants to download megabytes of unnecessary bloat.

When I build a site, I build it with the aim of getting the information on that site to as wide an audience as possible. So yes: where I’ve used JS, that means providing an alternative. A slide show might be replaced by a scrolling <div>, or, if it’s purely decorative, by a static image. But it won’t break and spill jumbled images over the rest of the page content, as so many sites do.

I sometimes wonder if part of the problem here is the “plug-in mentality”. You add a plug-in to your site, it does what you want, and you forget about it and move on; it requires little thought, and somehow the idea that “the plug-in takes care of that” negates the need to think about what happens if that plug-in fails, for whatever reason.
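
As a rough sketch of that slide-show fallback (the file names are just for illustration): the page ships a static image, and the script, if it runs, upgrades it into a slide show. If the script fails, the image simply stays put.

```html
<!-- The page ships a static image, so there is always something to see. -->
<div class="gallery">
  <img src="photo-1.jpg" alt="First photo of the series">
</div>

<script>
  // Upgrade the static image into a slide show. If this script never
  // runs, the markup above is untouched and nothing breaks.
  var photos = ['photo-1.jpg', 'photo-2.jpg', 'photo-3.jpg'];
  var img = document.querySelector('.gallery img');
  var index = 0;
  setInterval(function () {
    index = (index + 1) % photos.length;
    img.src = photos[index];
  }, 5000);
</script>
```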

So yes - I still consider no-JS fallbacks to be both good practice and a requirement, as far as basic access to content is concerned.

5 Likes

I was under the impression that there was a relatively recent push for JavaScript to be the main way of delivering content, instead of server-generated HTML. Hence why a lot of the more interactive sites literally self-obliterate into a whiteout when JS is disabled.

It also doesn’t help that when you disable JS on a site, the remaining CSS and HTML often look degraded, as if the developers didn’t put much effort into what a normal HTML website should look like.

I feel this is more a case of the developer either rushing the job at hand, being lazy, or using some kind of hand-made framework they never bothered to improve.

1 Like

I was going to ask for clarification and possible examples, but then I remembered how Gmail looks without JavaScript. Although I must add that it is far quicker on slow internet connections, and agreed, it does look “pasty” :-)

1 Like

My feeling is that a site should be Progressively Enhanced unless it can be assumed the target audience will want, or must have, the “extra”.

That is:

  • start with bare HTML, NO CSS, NO images, NO JavaScript, just text in semantic mark-up. If the page is still usable (though perhaps clunky and ugly as sin), good
  • place some images in the content in “strategic” places, making sure they have alt values, please
  • make it look better with some CSS. Good HTML can do wonders, but one has to admit most want and expect a web page to look more like a glossy print page than a paperback page
  • similarly, most want and expect a web page to be more like a television image than a glossy print page. For those, give it to them when possible with JavaScript (a small sketch follows below)
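
A small sketch of that layering, with illustrative names. Each layer only adds to the one beneath it, so losing a layer degrades the page rather than breaking it:

```html
<!-- Layer 1: semantic HTML; the links work with nothing else present. -->
<nav id="site-nav">
  <ul>
    <li><a href="/">Home</a></li>
    <li><a href="/articles">Articles</a></li>
    <li><a href="/contact">Contact</a></li>
  </ul>
</nav>

<!-- Layer 2: CSS for presentation; losing it costs looks, not function. -->
<style>
  #site-nav ul { list-style: none; margin: 0; padding: 0; }
  #site-nav li { display: inline-block; margin-right: 1em; }
  #site-nav ul.collapsed { display: none; }
</style>

<!-- Layer 3: JS adds a menu toggle. The button is created by the script,
     so without JS there is no dead control: the menu just stays open. -->
<script>
  var nav = document.getElementById('site-nav');
  var button = document.createElement('button');
  button.textContent = 'Menu';
  button.addEventListener('click', function () {
    nav.querySelector('ul').classList.toggle('collapsed');
  });
  nav.insertBefore(button, nav.firstChild);
</script>
```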

But I think it all depends on the target audience as to what can be available and what should be demanded of them.

Similar to TechnoBear’s mention of CodePen: if a site is about high-quality photographic images, it would not be unreasonable not to bother making the site look good when images are disabled. If a site offers online Flash games, it would not be unreasonable to require Flash to be enabled.

2 Likes

When I last looked at Gmail, it was their badly broken and completely unusable JavaScript that was my first concern; there was no way to get it working with JavaScript, so I turned JavaScript off. As they hadn’t provided a fallback, that makes their site, with its prehistoric JavaScript, completely unusable unless you specifically have user-controlled settings set to values that they recognise, rather than being able to set them to whatever you want, as should be the case with user-controlled values.

I don’t think that’s a case worth considering any more; people in these environments (if they exist) must be used to half of the internet not working for them.

There’s also the important distinction between web sites and web apps. I agree that all sites should be progressively enhanced, but apps generally require a lot of JS to provide the modern, responsive UI that is expected.

2 Likes

I don’t really understand what you’re getting at here. Gmail is one of the most successful web applications of all time; if it were as difficult or broken as you’re suggesting, it would not be. Making Gmail work as expected without JS would require what @m3g4p0p is talking about: building two completely separate apps. They will have correctly figured that the cost is not worth the benefit.

Even if they made it work, the experience would be so bad that it would still be “broken”, e.g.

Why can’t I make my text bold?
Why is it 20 times slower than usual?

Gmail without JavaScript makes no sense.

5 Likes

They are relying on people leaving certain browser settings alone so that their prehistoric JavaScript will work.

I have changed those settings to values I need for something else, which breaks their script (even though it wouldn’t have if they’d written it using a version of JavaScript appropriate for Netscape 3 and later).

I realise that most people don’t change those values, and therefore Google’s code works for them. I am simply using it as an example of how people who try to write JavaScript without having any idea of how it works end up with scripts that are broken for at least some visitors.

Could you be specific about which settings you’ve changed, and what you’ve changed them to please? I’ve seen you mentioning problems with Gmail before, but I have to say I have used it on a daily basis for many years and never had a problem, so I’m very curious to know how you broke it.

3 Likes

It is not often, but on the rare occasions Gmail refuses to load, it is necessary to load the vanilla version, which loads very quickly. Emails are then available, I am delighted to say :-)

If it were important to the masses, websites would be less reliant on client-side programming. However, it is not. Getting caught up on these things is a waste of time, only important to a few elitists who don’t matter. The people building the next generation of websites are building ones typically heavily reliant on client-side scripting to function. It’s been trending this way for a long time. If you don’t like it, stay off the internet. Being caught up on the best practices of a decade ago that have since been mostly made obsolete isn’t productive.

Since 2005 here. Probably one of the best-running, most stable web apps I’ve ever seen.