I think most people are used to the concept of iFrames by now, though I guess it depends how you visually implement it (providing cues would help). Either way the fact remains (back to the original topic) that it is possible to build a web application without JavaScript and as such, it shouldn’t be relied upon as the sole method of offering the service. This word processing script idea is proof enough that with a bit of common sense, a solution can be found (even if it is ugly).
There are only two things that can be relied upon at the browser end, HTML and CSS. Any client-side scripting and frames can’t be relied upon as they can be blocked by the user.
Your visitor can turn CSS off or override it with their own.
There is only one thing that can be relied upon at the browser end - that your content will be what is processed using whatever setup that particular visitor has in place to process it…
There are only two things that can be relied upon at the browser end, HTML and CSS.
Should be
There is only one thing that can be relied upon at the client end: non-frameset HTML.
The client may not be a browser. The client user may not be a human.
The client may not understand or see meta tags (such as meta refreshes, MIME types or charset settings).
The client may not be able to create a framed setting.
The client may not have a windowing system at all.
The client may not be able to process scripts.
The client may not be able to process images.
The client may be overlaid or used in conjunction with other software such as AT.
The client may not deal correctly with HTTP/1.1 (but we hope it does).
I knew a friend back in electronics class who had an old green CRT and a 386 machine. He read mail, surfed the internet and participated in our local irc channel, all through the terminal, just because it looked cool. I thought it looked cool. NERD GAWD! In his memory I use greenvision colourscheme in vi but it’s not as awesome : )
So, now all web applications should work in a terminal?
I think this topic went too far already
No-one is saying it should work on those archaic devices; the point (which perhaps you missed) is that you cannot rely on your visitors having the minimum requirements you set. As such you need to build upon the basics at every level to ensure your website and applications degrade as gracefully as possible: without scripts, without style and without consequence to the end user. Otherwise you are just as bad as the people who ignore accessibility entirely; after all, it’s your visitors who will be using your stuff on a (hopefully) regular basis, not you. Of course making your website work in every browser is asking too much, but the point still stands that if your users won’t have it, don’t ask for it. It’s not like everyone is as lucky as you in the technology they can make use of quickly and easily.
Everything has minimum requirements, and you just need to tell your users about them. As technology progresses, support for older technologies is inevitably dropped sooner or later in order to move forward.
It is not feasible to support all browsers in the real world anyway.
That’s garbage. As far as presenting information on a web page is concerned there’s no reason why all browsers can’t be supported. As far as allowing your visitors to enter information to send back to the server there’s no reason why all browsers can’t be supported. You might have something where someone needs to do 50 page loads from the server to achieve something that someone else can do with three mouse clicks but ALL browsers can be and are supported by anyone who actually knows what they are doing.
Dropping support for a browser means you no longer try to make it look and work identically to modern browsers. It doesn’t mean that you break your code so that it isn’t usable at all on that browser.
Properly written web pages and applications can be used in any browser. They just don’t look and function the same where the modern options are not available; older fallbacks are used instead.
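To make the “older fallbacks” idea concrete, here is a minimal sketch (the function name and the fallback chain are illustrative, not from any particular site): detect a capability at runtime and degrade rather than break. The environment object is passed in so the logic can be shown outside a browser too.

```javascript
// Sketch only: pick whichever request transport the client actually has,
// and return null when scripting-era transports are absent so the caller
// knows to fall back to a plain HTML form submission.
function createRequest(global) {
  // Prefer the standard XMLHttpRequest object.
  if (typeof global.XMLHttpRequest !== "undefined") {
    return new global.XMLHttpRequest();
  }
  // Fall back to the ActiveX control that legacy IE used.
  if (typeof global.ActiveXObject !== "undefined") {
    return new global.ActiveXObject("Microsoft.XMLHTTP");
  }
  // Neither exists: let the baseline HTML form do the work instead.
  return null;
}
```

The important design choice is the final `return null`: the script never assumes the capability exists, so the page keeps working when it doesn’t.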
You are talking theory. In the real world, many companies choose not to waste days or weeks of development time supporting, for example, IE 5.0, which maybe has less than 1% of users. You can agree with that choice or not, but it is how it is.
Most companies probably won’t bother to support a browser older than IE6; if they do, they would probably serve up a really simplified version of the site.
Crappy for the user who doesn’t have JavaScript enabled doesn’t necessarily mean crappy for the user who does. My point is that if you build a “desktop-like” application, it shouldn’t be expected to behave as a plain vanilla website. They are two entirely different animals.
It’s certainly possible to have active widgets and their equivalent alternate static content in the same page, but then you have a bloated application that tries to be all things to everyone and ends up fulfilling your own prophecy of being “crappy”. If you’re prepared to build an app that admittedly some users won’t be able to use, then you should be prepared to provide an alternate “low-fi” solution for those users. Whether both can be provided in the same page depends on the design. My advanced GUI apps have almost no static HTML markup in the page (and very little PHP beyond querying the database and returning JSON-encoded resultsets, for that matter), so in my case it would make more sense to create the static page separately from the DHTML app. Your results may vary.
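The split described above can be sketched like this (the function and field names here are mine, not the poster’s): the same resultset is serialised as JSON for the DHTML app and as plain HTML for the separate low-fi page, so only the presentation layer differs.

```javascript
// Hypothetical helper: one resultset, two serialisations.
// wantsJson would be decided by which page is asking (app vs low-fi).
function renderMessages(rows, wantsJson) {
  if (wantsJson) {
    // Consumed by the GUI app's script, which builds its own widgets.
    return JSON.stringify(rows);
  }
  // Static fallback: plain HTML the low-fi page can show as-is.
  return "<ul>" + rows.map(function (r) {
    return "<li>" + r.subject + "</li>";
  }).join("") + "</ul>";
}
```

Keeping the query and serialisation in one place is what makes maintaining two front ends cheap: the low-fi page is a second renderer, not a second application.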
Just remember that some groups of people who don’t have access to JavaScript have been known to sue when they find sites that they can’t use. Often the offending site can end up with a bill of hundreds of thousands of dollars for failing to comply with anti-discrimination laws.
Do you have the money to cover yourself if that situation arises?
Of course you may get lucky and just have the court insist on your creating a no JavaScript equivalent. Then you’ll only be out of pocket by the court costs.
In the real world most businesses don’t make websites, web professionals do (whether they were hired or work for the company itself). And in the majority of those cases, the professionals who build for the web aren’t narrow-minded enough to punish a minority group on the basis that they cannot comprehend the idea of progressive enhancement. Simply speaking, if you understand your trade and know what you are doing, it doesn’t take any more time or money to make a website work in all browsers (just to a lesser extent in rarely used ones). In fact studies have shown that professionals who embrace progressive enhancement almost always spend less time building websites because they don’t have to spend tireless hours tweaking their designs (backwards) and patching as they go.

Simply put… if you honestly believe that making a website that’ll work browser-wide and meets the needs of your visitors (depending on the devices and browsers they use) is either time or resource intensive, you aren’t doing your job right. And if you are ignoring the accessibility of your work (also inclusive of the usability requirements), you probably shouldn’t even be qualifying yourself as a professional. I have never had an issue at least making all of my work functional; it isn’t always the best option but it certainly doesn’t take an excessive amount of time. Making a website dependent on JavaScript because you aren’t willing to take the steps to ensure it remains functional for those without it enabled is either sheer laziness, discrimination or ignorance on the part of the person being negligent towards the client base.
Well said Alex.
A web site that doesn’t work when JavaScript is turned off is a clear indicator that it was built by an amateur rather than a professional since no professional would risk alienating their client’s customer base by doing that. After all they could end up being sued by the client for building a site that reflects badly on the client.
Do you really think I meant that CEOs of companies write HTML code on their own?
So, what you are saying is that making, for example, Gmail work in all browsers requires the same amount of time as building it for Firefox use only?
If so, you have probably never written any JavaScript code for one browser (Firefox, for example) only, to see the difference.
Actually I have written plenty of JavaScript, and I don’t find writing it for all the major browsers an issue at all. Because I build websites constantly I am well aware of each browser’s “problem areas” and write code so that it functions correctly in each (if that requires a bit of backup code then it’s done autonomously - without thinking about it). Most professionals take these factors into account as they code, so the time it takes is always equal (as it’s always considered and factored in). What you are suggesting is that it would be more time efficient to do a lousy job and cut those considerations out on the basis that you can shave off a few minutes, which would be correct, but in the long term it’s not a viable option or even a credible way to conduct business (and as a professional it’s rather disconcerting that you would even consider it an option). Any decent coder worth their salt ensures that their code works to the required specifications.
PS: I really don’t think it’s the right attitude in ANY situation to throw away clients, actively endorsing JavaScript dependence is very insulting.
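The kind of “backup code” being described was typically wrapped in a small helper so a known browser difference is absorbed in one place and every call site stays identical. A hedged sketch (`addEvent` is a hypothetical name; the branches mirror the standard DOM model and the legacy IE model respectively):

```javascript
// Attach an event handler whichever model the browser supports.
// Returns true when a listener model was available, false when it
// had to fall back to the last-resort DOM0 property assignment.
function addEvent(element, type, handler) {
  if (element.addEventListener) {
    element.addEventListener(type, handler, false); // W3C browsers
    return true;
  }
  if (element.attachEvent) {
    element.attachEvent("on" + type, handler);      // legacy IE
    return true;
  }
  element["on" + type] = handler;                   // DOM0 fallback
  return false;
}
```

Once a helper like this exists, “supporting another browser” costs one extra branch here, not a change at every call site - which is why the per-project overhead stays close to zero.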
I did not say that you didn’t write any JavaScript, I was talking about writing JavaScript code for one specific browser only, to see the difference.
Exactly, and required specification does not always contain all browsers, does it?
Even if cross-browser issues are known, you still need to take time to write the code (even if it does not take much time to write), test it and support it. Testing alone takes time for each browser.
So your statement that it does not require any additional time is simply not the case.
The purpose is not to specifically support older browsers such as IE5. It is to begin development so that anything is capable of using it, and then to progressively add on enhancements, such as CSS, JavaScript and other technologies. That is how progressive enhancement works.
Obviously you disagree with that way of doing things. I suspect that you are more closely aligned with people in the graceful degradation camp.
In my opinion, this discussion has become a holy war of progressive enhancement vs graceful degradation.
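The layering described above is usually implemented unobtrusively: the page starts as a working HTML form, and a script, if it happens to run, upgrades it. A minimal sketch under that assumption (all names here are hypothetical):

```javascript
// Enhance an already-working form: intercept its submit and send the
// data in the background instead. When scripting is absent this code
// never runs, and the form submits normally - the baseline still works.
function enhanceForm(form, sendInBackground) {
  if (!form || typeof sendInBackground !== "function") {
    return false; // prerequisites missing: leave baseline behaviour alone
  }
  form.onsubmit = function () {
    sendInBackground(form.action); // enhanced path
    return false;                  // cancel the full-page submit
  };
  return true;
}
```

Note the direction: the enhancement is added to a page that is already functional, rather than the page being built to depend on the script.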
Since the JavaScript that GMail uses is actually garbage that only works sometimes in ANY browser it is fortunate that GMail actually works with JavaScript turned off. Google really ought to consider employing some JavaScript programmers since their applications all work better without the JavaScript.
So in this instance it is an example of the applications only being tested in one particular configuration of browsers rather than being coded correctly to work in all browsers or even for everyone using one browser.
The only plus going for GMail is that it actually does work without the JavaScript.
With those applications of Google’s that don’t work without JavaScript it is broken for a lot of people whether they have JavaScript on or off because Google don’t employ anyone with the first clue as to how to write proper JavaScript.
So Google is a good example both of sites that work without JavaScript (and which can therefore be used by anyone) and of sites which don’t work without JavaScript (and which can therefore only be used by a subset of those with JavaScript enabled), simply because Google’s JavaScript hasn’t been written properly to work with all browsers. For example, I can’t get it to work with IE8 or Firefox 3.6 when either has its useragent set to match the Googlebot useragent, simply because their script does an invalid test of the user-enterable useragent field for specific values that have no real meaning or significance.
You really seem to have no comprehension of the term “progressive enhancement” (based on what you have put across); it has absolutely nothing whatsoever to do with coding for an explicit browser. The entire concept is that you make a website which will work for every possible user (plain old HTML), you then layer on the CSS, checking that it works in a variety of commonly used browsers (as long as it works to a basic extent it’s still functioning, and if there’s no CSS you know it’ll still work because you tested using pure HTML). And THEN you layer on the JavaScript functionality, ensuring that everything still works as it did previously (using server-side scripting as applicable). What you are suggesting is backward compatibility… as in you test on each browser you want it to function on… a redundant technique very few people use, purely because of how unproductive it is. Having everything work at a basic level BEFORE you touch JavaScript is progressive enhancement. The point in fact is that you progressively enhance each thing a website offers, knowing that when something is unavailable, those without it still have a functioning experience. The very nature of what you are putting across says to me that you either have little practical coding experience or you have some serious flaws in your methodology.
PS: pmw57, actually he seems in favour of the (now debunked) backward-compatibility approach (as in doing everything at once and applying glue to all the stuff that breaks post-completion). Gracefully degrading code accounts for components not working or being available (like JavaScript), as alternatives are given. The difference however is that with progressive enhancement you are pushing more boundaries by attempting low-support functionality; graceful degradation just tries for what is reasonably going to be available to everyone. It’s less a holy war of practices, more a case of someone using a deprecated methodology.