So you don’t want to be able to quote other members’ posts then?
It’s an interactive forum. Without the interaction, there is no content.
I think that’s what it all boils down to. Web development is essentially a trade, like plumbing or landscaping. Those trades could all have standards movements equivalent to web standards, but in reality, it’s still going to be hard to find a tradesperson who is going to bother to strive for excellence when most customers are looking to pay as little as possible. As a customer, even when you are prepared to pay more for a good job, it’s hard to find a tradesperson who won’t just do the bare minimum and rip you off.
That’s just plain insulting. I’m not planning on getting mired in this debate, but I wholeheartedly agree with @mawburn that it’s a dreadful waste of time to be supporting no-JS. Ensure that your JS works if you’re going to use it? Yes. Browser and version test it? Yes. Ensure there’s at least a message indicating that JS is needed? Sure.
But your insinuation that you can’t be a good developer with that mindset, or that you’re doing a bare minimum and ripping off paying customers, is insulting. Unless a customer of mine specifies the need to support no-JS, it’s not a ripoff - it’s a modern viewpoint. And like any changes in technology or any other field, some people are resistant.
You definitely can be an excellent developer and choose - perhaps even explaining so to the client/employer/whoever - to not support no-JS.
Apologies for going slightly off-topic, but…
I’m not sure why you think that? From what I can tell, PHP7 is going to be a big step forward… better performance, smaller memory footprint, and apparently no big BC breaks. The PHP language and ecosystem are massively improved from where they used to be when I first started developing, and PHP7 just seems like it will continue that trend.
Now, I’m not talking about loading external content (Twitter feed, Facebook feed, etc.). I’m talking about providing the base content of your site. You’d be amazed how many sales companies have lost from my wife because they were so JS-happy that their sites just wouldn’t load on her phone (which is supposed to have 4G connectivity - thanks ****).
And it’s a matter of market awareness. If you know that a lot of your customer base is using phones, and you know that they’re possibly not all on high speed connections, and/or you know they’re running without JS… then I guess you’d better take that into account! That doesn’t mean it’s a universal development policy though, that’s being a smart developer or business owner and knowing your requirements.
Edit: TL;DR - making absolute declarations about this sort of thing, like some of you are doing, is where my problem resides. I agree that in some cases your view is right - but not in all of them, or perhaps even most of them.
To me, js/no-js falls into the same category as fixed width sites. Unless you’ve got a mission critical need for it, you shouldn’t depend on it because otherwise you can be chasing customers away.
Right, but that’s not what I’m arguing. I’m arguing against absolute declarations that not supporting no-JS (or depending on JS, however you word it) is negligent or wrong. Those blanket statements are the way people often try to win arguments or discussions: “Yeah, but you guys just suck at your jobs if you say that” type of statements. That’s not logic; it’s just being insulting for the sake of disagreement.
And my personal opinion (because I’m willing to say it’s just that - my opinion) is that there’s little need for that type of backwards support, as @mawburn says, same as IE6 or something.
Sorry I keep editing
But there are always exceptions to any “rule” or declaration. The problems occur when people use the counter arguments to go against “best practice” (ugh, I hate that term) just because it’s quicker and easier, which is what I think Ralph was trying to get across when he made that statement.
As for the IE level of support, that all comes down to your stats. If your stats show you don’t need to support it, or the intended client base is likely to be up to date technology-wise, then fine; but if you’re going for a wider client base, then you’d better be prepared to be more flexible in what you support. I’m not trying to say you need to serve them curved borders or canvas alternatives, but you should be able to provide them with a somewhat usable product, even if the functionality is limited somewhat - concessions do have to be made.
True, exceptions have to be made to almost any rule. But a significant portion of websites out there may have target markets that don’t necessitate that type of support. As you say, it depends on your stats, and your target client market. So I think a blanket rule in this case is not viable without conditionals attached (“If you intend to support as many users in differing circumstances as possible, graceful degradation on some level is necessary” - would be a way better phrasing for such a “rule”).
Either way, I still disagree with what was said, on social-behavior grounds - basically listening to people’s counterarguments to your opinion and then saying “you suck” in more eloquent verbiage. That’s done in poor taste. Hopefully that wasn’t how it was meant.
In any case, I think we’re saying the same thing @davemaxwell and just not agreeing on how to say it. It depends on your market, and/or your stats. My belief is that there are too many variations to make a blanket statement - instead, there should perhaps be a set of “best practices” - I hate that, too - that give guidelines on varying sets of circumstances. How to best notify users when they need JS; how to gracefully degrade when that’s appropriate; How to tell if it’s appropriate; what stats to look at; etc.
Also, I forgot to respond earlier to another point - I think you’re 100% right on many extremely simple informational sites. There are too many that flat-out require JS that shouldn’t really need to - and it’s something I’m reluctant to judge, because of my stance above. Maybe they feel that the need to have a positive impact on those that do have JS enabled outweighs the need to support non-JS, and don’t have the money to invest in a good UI for both? No idea. However, if it’s a small business advertising to basically anyone and everyone, and they simply need to display some information, putting any potential obstacle in the user’s path could be an issue. On the other hand, as mawburn has said, at some point we have to stop even considering that much - at some point, the technology should (will?) be considered the norm and required. Not sure when or if that’ll be, or if we’re there now. I guess my thought is that as long as both developer and owner of the project know the risks and benefits… there is no best decision to be made for them; they make the best one for their case.
For me, I like to try to build sites that are fast. Waiting for JS to do something that a user won’t notice, or will look at once and then ignore, is not something I really want. If, as above, there is something only JS can do, or it helps with accessibility, then I’ll use it. E.g. I have a JS delay on hover for my menu system so that people who can’t control the mouse very well don’t keep activating the dropdown by accident. If it doesn’t work, it’s not a problem, as the default dropdown will still work in CSS, just without a delay.
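The hover-delay idea above can be sketched in a few lines. This is my own illustration, not the poster’s actual code: the function name, the callback shape, and the injectable `timers` parameter (handy for testing) are all invented. The idea is simply that the dropdown only opens after the pointer has rested for a moment, so an accidental pass over the menu does nothing.

```javascript
// Sketch of a hover-intent delay for a CSS dropdown menu (names invented).
function createHoverIntent(delayMs, onOpen, onClose, timers) {
  // Default to real timers; wrapped so setTimeout isn't called with a bad `this`.
  timers = timers || {
    set: function (fn, ms) { return setTimeout(fn, ms); },
    clear: function (id) { clearTimeout(id); },
  };
  let pending = null;
  return {
    enter() {
      // Only open once the pointer has rested for delayMs.
      pending = timers.set(function () { pending = null; onOpen(); }, delayMs);
    },
    leave() {
      // A quick pass over the menu cancels the pending open.
      if (pending !== null) { timers.clear(pending); pending = null; }
      onClose();
    },
  };
}

// Browser wiring (hypothetical markup):
// item.addEventListener('mouseenter', intent.enter);
// item.addEventListener('mouseleave', intent.leave);
```

If the script never runs, nothing is lost: the `:hover` CSS rule still opens the dropdown, just without the delay, which is exactly the graceful fallback described above.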
I try and build from a good base so if it does all degrade it goes to something readable at least.
To a certain extent it also covers my behind before my boss comes to me and says ‘why does this look weird on this random tablet/desktop/phone’.
Bosses don’t always understand that IE7 or non-js is only X amount of users, they just see it as people they are losing.
It’s evident you’ve never tried Progressive Enhancement. Render the page/Ajax result in HTML if you like. Use Node.js with the same rendering code on both the client and server. Even if you do it totally separately, you can use the server HTML as a template on the client and just substitute values. It’s hardly double the effort, and the benefits more than outweigh it…
…such as your content is readable by a search engine. That’s a big plus point. Few clients realise their JS-powered site has just become invisible. (And, yes, I know Google understands some JS shenanigans, but it’ll never find it as easily as plain old HTML.)
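The “same rendering code on both sides” idea can be made concrete with a sketch. Everything here is invented for illustration (the data shape, helper names, and the `layout` function mentioned in the comments are not from the thread): one template function produces the initial server-rendered HTML, which works with JS off and is crawlable, and the client reuses the same function to re-render a fragment after an Ajax update.

```javascript
// Minimal escaping so user data can't inject markup.
function escapeHtml(s) {
  return String(s)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;');
}

// One render function, usable on server and client alike.
function renderItem(item) {
  return '<li data-id="' + item.id + '">' + escapeHtml(item.title) + '</li>';
}

function renderList(items) {
  return '<ul class="items">' + items.map(renderItem).join('') + '</ul>';
}

// Server (Node, hypothetical): res.send(layout(renderList(items)));  // baseline, no JS needed
// Client (enhancement only):   list.outerHTML = renderList(updated);
```

The server response is the baseline; the client-side call is purely an enhancement, so nothing breaks when JS is unavailable.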
I’m not getting how this changes for JS. I’ve not seen anything mentioning ES6 or ES7 removing any backwards compatibility.
I haven’t found any that doesn’t and the site I work on still has lots of IE5/4 and Netscape stuff around. There are some pages, which were very modern in 1999 and full of JS, that haven’t been touched in about 10+ years. They all still work perfectly fine and are still used.
Some of that old code still floats around in some of the .js files that are used elsewhere. But, I usually end up removing it or replacing it when I see it.
OK, how much work would it be to make a TodoMVC without client-side JS?
Quite a bit.
Even more if you don’t want the functionality to be completely garbage, because you’ll probably need a complete redesign of how user input is handled. (A user shouldn’t have to refresh the entire page every time they want to insert, update, or delete an item, for instance. So now you have to handle all the data at once, instead of in pieces.)
This is no longer graceful degradation or progressive enhancement; you now have two entirely different apps. Not everyone wants to support and maintain two entirely different apps that provide the same function to appease a minuscule portion of users.
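To make the “handle all the data at once” cost concrete: without client-side JS, every mutation becomes a form POST followed by a full page re-render. Here is a rough sketch of just the server-side state step, with invented field names (`act`, `id`, `title`) — each `<form>` on the page would POST one of these actions, and the server would apply it and respond with a freshly rendered page.

```javascript
// Server-side state transition for a hypothetical no-JS TODO app.
// `todos` is the full list; `form` is the parsed POST body.
function applyAction(todos, form) {
  switch (form.act) {
    case 'add':
      return todos.concat([{ id: form.id, title: form.title, done: false }]);
    case 'toggle':
      // Re-build the whole list just to flip one item.
      return todos.map(function (t) {
        return t.id === form.id ? Object.assign({}, t, { done: !t.done }) : t;
      });
    case 'delete':
      return todos.filter(function (t) { return t.id !== form.id; });
    default:
      return todos; // unknown action: re-render unchanged
  }
}
```

Note how the server owns the entire list on every request, whereas the client-side version can patch a single item in place without a round trip — which is exactly the redesign of input handling described above.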
I think that both you and @mawburn could agree that no one cares if a site coded in 1998 still works. I personally don’t care if a site I code today still works (without any further work done on it) in 20 years. The Internet changes too fast for that. It will be an irrelevant question.
(And, yes, I know Google understands some JS shenanigans, but it’ll never find it as easily as plain old HTML.)
The technical language and proof you present here make me believe you! (In addition to being vague, this statement assumes that Google will never get better at this than it already is - which I can’t imagine how we’d predict.)
I actually am employed by an app that was originally developed in that time frame, with pieces from that time frame still in use. But they are used far too infrequently, their functions are far too important, and they are far too complex to mess with. It would be like a six-month, two-man project to convert them to current standards, because they are written in a custom C++ engine. They are used every day and what they are used for is mission critical, but not critical enough to justify the time it would take to bring them to current standards.
It sucks, but it is what it is. They still work fine. They even still have copyrights that say 1999.
Just a couple weeks ago, some of the dropdowns needed to be sorted. Instead of digging into the bowels of that C++ engine to figure out how to sort them based on the data structures that rendered them, I went in and slapped some jQuery down and sorted them on the client side. Worked like a dream.
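That kind of client-side sort really is only a few lines. This is a guess at the shape of it — the actual selectors and code aren’t shown in the post, so the class name and jQuery usage below are hypothetical — with the comparator split out so it can be reused:

```javascript
// Comparator for <option>-like objects; sorts by visible text.
function byText(a, b) {
  return a.text.localeCompare(b.text);
}

// Hypothetical jQuery usage (runs in the browser; selector invented):
// $('select.needs-sorting').each(function () {
//   var opts = $(this).find('option').toArray().sort(byText);
//   $(this).empty().append(opts);  // append() re-inserts the same nodes in order
// });
```

Because jQuery’s `append()` moves existing DOM nodes rather than copying them, re-appending the sorted `<option>` elements reorders the dropdown in place — no need to touch whatever server-side engine rendered it.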
I guess it is what it is, but if their use is actually mission critical… I’d take the time. If something went wrong that couldn’t be fixed and they’re actually mission critical, that’s definitely an issue.
So, maybe I’ve fallen on my own sword about blanket statements but I’m sincerely hoping that using a 20 year old app in any mission critical way really is an exception and not a norm, haha.
Edit: “Hoist by my own petard” might be a better term; falling on your sword is an honor analogy, I suppose.
Indeed. Whenever this topic comes up, the justification for ditching PE is always that 1) if people choose to turn off JS then they get what they deserve (which is not the issue at all), and 2) that some apps are meaningless without JS, which is of course true, but also largely irrelevant, because the real problem is that there are many thousands (if not millions) of sites now that are wholly dependent on JS for no good reason whatsoever.
Well it is for that certain amount of users. There is nothing that is going to break in it any time soon. The HTML doesn’t meet full validation and it looks like hell, but it works in Chrome 42 just as well as it worked in IE4.
It’s probably half a million lines of code and each page renders close to 250kb of data without images.
Honestly, if we did go in and change it, it would probably piss off the users who do use it.
If something goes wrong, it can be fixed. But there’s no reason to bring it to modern standards. (some of the original developers actually still work here!)
Honestly, if we did go in and change it, it would probably piss off the users who do use it.
Isn’t that the story of upgrades?
What’s worse is when the users are the ones crying for newer standards, a better or “prettier” GUI, etc… and then when they get it, do nothing but reminisce about the old one.
But that site does basically work w/o client-side js running. It’s uglier and not as functional, but it does work.
But too often, prettier means more complex - just human nature - “let’s toss these bells and whistles in. Why? Just because we can!”