HTML5 and the W3C Recommendation Ruckus

By Craig Buckler
If you believe the reports, web users are shouting from their virtual rooftops and holding impromptu parties. There are also bizarre stories detracting from the announcement and making ludicrous claims such as:
- The status has changed despite bitter specification squabbles between the W3C and WHATWG.
- HTML5 is Steve Jobs’ legacy and would never have happened if it weren’t for his foresight (in banning Flash).
- The web is far from complete and HTML5 won’t be usable for another decade.
The question for developers: how does this affect our daily website building chores?
The answer: it has absolutely no impact whatsoever.
The reason is simple. W3C specifications do not tell browser vendors what to do; they record what has been done. This point continues to cause widespread confusion but the general process is straightforward:
- Vendor A implements feature X in their browser.
- Vendor B (and possibly C, D, etc.) applaud loudly and consider implementation in their browsers.
- Feature X is documented in a W3C specification.
- The implementation is discussed and refined. The specification status moves from Editors’ Draft to Candidate Recommendation and, finally, to Recommendation.
Vendors always have ultimate control. Features are not guaranteed to be implemented, usable or consistent just because they appear in a W3C document. Fortunately, while Google, Apple, Microsoft, Mozilla, and Opera are competitors, the process encourages interoperability because it's impractical for developers to adopt an unstable HTML5 technology. If we won't use a feature, there was little point in the vendor creating it in the first place.
Of course, some vendors have bigger clout than others or can influence HTML5 in different ways. Google controls Chrome/Chromium, the market leader, so a feature is unlikely to be widely usable until it reaches that browser. Apple is free to implement an iPhone-specific feature and doesn't necessarily care whether it becomes an official HTML5 technology, because it controls that device.
From a development perspective, none of this matters. The web is device-agnostic. Building sites and applications is, or should be, an exercise in robust progressive enhancement. For example:
- You can use the new HTML5 `<input type="date">` control. Some browsers have full support and display a calendar. Some have mid-level support and provide basic date validation. Some have no support and fall back to a standard text box.
- You can adopt CSS3 techniques such as animation. Some browsers have full support. Some require a vendor prefix, which you can omit, add manually, or generate automatically. Some older browsers have no support for animations, but you can ensure your application remains usable without them.
- You can store data on the client using IndexedDB. You can detect whether it's supported by checking that `window.indexedDB` exists, and fall back to another solution such as server-side storage when necessary. The application may not be as polished on older browsers, but it'll continue to work.
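The detection techniques in the list above can be sketched in a few lines of JavaScript. This is a minimal, hypothetical sketch, not code from any particular library: the function names are illustrative, and it assumes a browser environment where `document` and `window` are the usual DOM globals.

```javascript
// Feature-detect <input type="date">. Browsers without support silently
// revert the type property to "text", so set it and read it back.
// (Function names here are illustrative, not from any library.)
function supportsDateInput(doc) {
  var input = doc.createElement("input");
  input.setAttribute("type", "date");
  return input.type === "date";
}

// Choose a client-side store: use IndexedDB when it exists, otherwise
// fall back to something else (here, a hypothetical server-side store).
function chooseStorage(win) {
  return win.indexedDB ? "indexeddb" : "server-side";
}

// In a real page you would call these with the DOM globals:
//   supportsDateInput(document)  // true where a native date picker exists
//   chooseStorage(window)        // "indexeddb" in most modern browsers
```

Either result lets the page keep working: an unsupported date input degrades to a text box you can validate yourself, and a missing `window.indexedDB` simply routes the data elsewhere.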
HTML5 becoming a W3C Recommendation is great news but developers have been using it for almost five years. Sometimes a technology has good browser support. Sometimes we decide to use a shim. Sometimes we fall back to lesser functionality or slower server-side processing. Sometimes we simply abandon an old, infrequently-used browser. The decision is a compromise based on technical, logistical and economic constraints. The W3C specification status, vendor competition and standards body politics will always have a negligible impact.
That said, those who’ve been irrationally avoiding HTML5 because “the specification is still a draft” now have fewer excuses. HTML5 is complete; let’s focus our attention on HTML5.1!