I think, in some sense, we are in somewhat of a scripting-dependency backlash, and it’s probably a good thing. While ARIA roles and other accessibility features can help, dependency on scripting for displaying simple content seems to be taking things too far. Developers seem to be using flashy new tools just for the sake of it, and not because they are solving an actual problem.
Do you have any thoughts on this? Are progressive enhancement techniques too old-school? Or is the scripting-dependency backlash a good thing?
Browser plugins can inject alterations that get in the way
The CDN can be unavailable
ES6 has compatibility issues
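The CDN point above is commonly handled with a fallback check — if the CDN script failed, its global is never defined, so a self-hosted copy gets loaded instead. A minimal sketch of that check (the library name `myLib` and the paths are invented for illustration):

```javascript
// Hypothetical CDN-fallback check (the library name is made up):
// if the CDN <script> loaded, it defined a global; if not, the page
// should fall back to a self-hosted copy.
function needsLocalFallback(globals) {
  // The CDN bundle would have defined `myLib` on the global object.
  return typeof globals.myLib === 'undefined';
}

console.log(needsLocalFallback({}));            // true: CDN failed, load local copy
console.log(needsLocalFallback({ myLib: {} })); // false: CDN copy loaded fine
```

In a real page this decision would drive injecting a `<script src="/js/myLib.min.js">` tag as the fallback.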
The whole school of progressive enhancement provides a whole new level of elegance on top of our craft. Anyone can put up a website, but providing the content first and ensuring that it’s capable of working across the great sea of access methods and capabilities out there - that’s a trick we’re still learning to master.
Mostly it just takes time and effort to develop the muscles to achieve this, which is why people turn to frameworks so that they don’t have to do as much thinking. Coming off those frameworks helps us to exercise those muscles once again.
To me a lot depends on if they’re being used as a tool or a crutch.
If I had to churn out websites in rapid succession then I would consider using a framework as a tool to use as a time-saver.
Unfortunately, as Paul said, it seems many use them “to do what they don’t know how / want to do”.
Technology changes, and those that use frameworks as a tool should hopefully be able to switch to some other way (albeit likely not without some grief). The problem will be for those using them as a crutch, because they’ll need to either learn what they should already know, or find something else to lean on.
Hm, if only it were that simple. Every second time I go to Facebook or Google services, the JS doesn’t load for some reason, and no amount of refreshing fixes it. I have no idea why I get such a bad connection, but still, most of the content I miss out on needn’t depend on JS at all.
I guess, in a way. I’d consider it more of a failover than progressive enhancement. Discourse’s approach is a failover rather than progressive enhancement (or even graceful degradation). It is definitely an afterthought.
I can understand that, but I also don’t feel it should affect the end result. By working from the base level up, you’re effectively limiting what your app can potentially be - whether you mean to or not, and whether you try to let it limit you or not. Especially if you’re considering using an MV* framework, your app is probably meant to be highly functional. You’re punishing the many for the actions (or problems) of the few.
And if you’re going this far, why not support older versions of IE?
Here is a snapshot from last month’s stats on the app I work on. It’s based on tens of thousands of unique users, all in corporate environments across different companies.
If you’re worried about corporate firewalls, why is there such a push to drop IE8 support? Because… you can’t move into the future by providing support for everything in the past and every possible curve-ball your users throw at you. The funniest thing is that a lot of the places I’ve seen push hard for progressive enhancement don’t provide any support for < IE10.
CPU manufacturers keep pushing the envelope, trying to create products that outperform competitors’, bringing us closer and closer to the physical limits of the technology (at least until a new technology replaces the current one).
Restaurants keep formulating new recipes and combinations, trying to create dishes that will be more appealing than competitors’ (like the ‘restaurant wars’ mentioned in “Demolition Man”).
Beverage manufacturers keep introducing new ideas into the liquid-refreshment market, trying to make sodas/teas/beers that will be more appealing than competitor beverages (the ‘cola wars’).
What do all of these have in common with ‘scripting-dependency’? Consumer drive. Whatever is more appealing to the masses drives the direction of anything created for mass consumption. Sadly, this means that a lot of the empty flash and pop that so many sites/apps are integrating is there because it is thought to be more appealing to the vast majority of the target demographic; any complaints are generally few and are drowned out by the shouts of demand from the majority.
Then there is a UK specific post from 2013 that shows about the same:
I used to believe this was important, but after working for several companies, all of which couldn’t care less, it is just easier not to care. Honestly, as mawburn said, it just isn’t worth the trouble when you compare the numbers. That being said, optimization can play a huge role in using a single-page application. I’d rather have a faster site for most of my user base, using something like AngularJS with a REST API, than something much slower that also works for the less than 1% of other users. The fact of the matter is that a single-page application will always provide a more fluid and faster experience than something that has to keep reloading the page, which is one reason they are hot right now as web applications become more and more complex and interface with multiple technologies and vendors.
I think that’s what it all boils down to. Web development is essentially a trade, like plumbing or landscaping. Those trades could all have standards movements equivalent to web standards, but in reality, it’s still going to be hard to find a tradesperson who is going to bother to strive for excellence when most customers are looking to pay as little as possible. As a customer, even when you are prepared to pay more for a good job, it’s hard to find a tradesperson who won’t just do the bare minimum and rip you off.
As mentioned above, that’s not really the issue—though it’s really hard to get this message across to those who don’t want to hear it. I got into web design during the renaissance of web standards, and it’s really sad to see it all go down the gurgler so quickly.
Progressive enhancement is the best way to code for the web. It always has been. If you’re using mobile first, that’s a PE technique.
The comments above reject PE because a tiny proportion of users disable JS. That’s not the issue. The problem is the plethora of devices we have to support from feature phones to screen readers to tablets to PCs. It’s impossible to test everything but, with PE, you don’t need to. You’re writing defensive code; if any aspect breaks or fails, the user will still see something (presuming HTML is returned). PE is no more effort than whatever you’re doing now. It’s a way of thinking - not a technology. There’s absolutely no reason to reject it.
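The “defensive code” idea above can be sketched in a few lines (the function names and markup here are invented for illustration, not from any particular framework): render working HTML first, and only layer behaviour on top when a capability check passes, so a failure leaves the baseline intact rather than leaving the user with nothing.

```javascript
// Illustrative progressive-enhancement sketch (all names made up).
// Baseline: markup the server always returns; it works with no JS at all.
function renderBaseline(items) {
  return '<ul>' + items.map(function (i) { return '<li>' + i + '</li>'; }).join('') + '</ul>';
}

// Enhancement: applied only if the capability test passes; a missing or
// broken enhancement falls back to the baseline instead of breaking the page.
function enhance(html, capability) {
  if (typeof capability !== 'function') {
    return html; // defensive fallback: the baseline still works
  }
  try {
    return capability(html);
  } catch (e) {
    return html; // a broken enhancement must not break the page
  }
}

var page = renderBaseline(['home', 'about']);
console.log(enhance(page, null)); // no JS capability available: plain list is still shown
```

The structure is the whole point: every layer assumes the layer above it might be absent, which is why nothing has to be individually tested on every device.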
Sites I built in 1996 still work today. Of course they’re awful but they’re usable. Will you be able to say the same thing about an Angular-powered site in twenty years? Or even five?
So yes - we are seeing a scripting-dependency backlash. And it’s about time.
I’ve got to agree with ceeb. Progressive enhancement is still the way forward after all these years.
I have developed my own Framework (Cliqon) which does the latter. My eldest son and grandson, both programmers, were trying to tell me that the former was the best approach.
I knew them to be wrong but wanted the opinions of other Programmers on Quora about the matter. Every single respondent supported my way of doing things.
So from my point of view, my response to your original question is straightforward. I consider your question to be ill-judged and, if required to give an answer, my answer would be absolutely NO.
I’m not a professional web developer so my view is from the consumer standpoint. It just seems common sense that you want maximum access to your site especially if it is a business. Many sites now cause my iPad to crash or become so sluggish as to be unusable. It is very frustrating and, as I do understand the web a bit, I can’t understand why. It worked perfectly only 6 months ago. I understand that it won’t be able to run the latest apps/games but I know that there is no real reason why I shouldn’t be able to buy something over the web.
Where I’d go from here, though, is to argue that, due to the progressive-enhancement manner in which they were built, you can today fairly easily make further improvements to them to meet today’s demands.
Is there anything that would make you think they wouldn’t?
All the tacky JS animations and Flash I made in Swish still work fine 15 years later. Why would things built on the current base of standards (JS) not?
Will I like what I’ve done in 5 to 10? Probably not. But they will still work.
Yes, there is at least double the effort in your description. I now have to make sure the server and the client can render the information in exactly the same way. These are two entirely different processes.
Not to mention, what if I have other pieces being updated by JS? Usually a page that updates via AJAX is being updated in multiple ways. So now, not only do I have to make your new Next request work identically on the client and when rendered by the server, I have to make sure all these other little pieces do the same.
Now you’re talking about even more things I have to double up, which has now turned this Next page into an engineering headache.
I mean, you can nonchalantly say this isn’t more work, but I assure you, it is absolutely not a trivial amount of extra work.
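For what it’s worth, one common way to cut down the doubling-up described above is to keep a single render function that both the server response and the client-side update call, so at least the markup-producing code exists only once. A sketch under invented names (this doesn’t remove the duplication of wiring, only of the rendering itself):

```javascript
// Sketch of sharing one render function between server and client
// (all names invented). Because both sides call the same function,
// the two outputs cannot drift apart.
function renderArticle(article) {
  return '<article><h1>' + article.title + '</h1><p>' + article.body + '</p></article>';
}

// Server side: embed the rendered fragment in the full page response.
function serverResponse(article) {
  return '<html><body>' + renderArticle(article) + '</body></html>';
}

// Client side: an AJAX "next" handler injects the same fragment.
function clientUpdate(container, article) {
  container.innerHTML = renderArticle(article);
  return container;
}

var a = { title: 'Hello', body: 'World' };
// The server page contains exactly the markup the client would inject.
console.log(serverResponse(a).indexOf(renderArticle(a)) !== -1); // true
```

In practice this is the idea behind isomorphic/universal rendering; it trades the double implementation for the constraint that the shared code must be able to run in both environments.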
I agree with this. If the site is content based, where UI/UX gets pushed to the side, then it should stay content based.
All I want to do on a blog post is scroll up and down.