How far is it possible to future-proof a website?

IMHO, not very! Crystal ball anyone? :slight_smile:

In my experience, an average website ‘needs’ to be completely rebuilt after, say, 3-7 years of maintenance and upgrades. Why? Examples:

  1. New platforms (iPad anyone?) come along that the website is not optimised for (and needs to be).
  2. Standards have moved on.
  3. The design starts to look very dated.
  4. New technologies come as standard with new systems, but are expensive to retro-fit to an old framework.

I don’t think many people investing in a new website design and development project today expect to have to do it all again, and re-invest similar amounts, in 3-7 years’ time. Actually, I have seen one or two more enlightened/experienced clients of ours state this as their expectation for their new sites, but that attitude is (sadly) rare!

So what defences/methods do we have to extend the worthwhile lifespan of websites we’re building today, i.e. before diminishing marginal returns mean that future investment is better spent rebuilding the site from scratch? Ideas:

  1. Make the site compliant with the latest standards
  2. Stick to mainstream technologies (and hope that they’re still mainstream in 5+ years!)
  3. Ensure that all maintenance/upgrades on the site are quality implementations, not hack jobs
  4. Ensure that sites are, where relevant, pro-actively patched (e.g. Drupal or WordPress modules)

Any other ideas/suggestions/comments/observations on this issue?

p.s. Reason for raising this: I’ve got a client asking my company to implement a whole range of new features on a site that I personally (oh dear!) coded about 9 years ago. I’m amazed it’s still in active use today - but that’s probably more down to a lack of budget on their part than a reflection of my awesome future-proof coding at the time! So I’m trying to convince them to re-invest in redeveloping it from scratch, rather than throwing more money at an old system for features that come as standard today in off-the-shelf packages.

In general, standards are pretty future-proof. Heck, HTML 4.01, XHTML 1.0 and CSS 2.1 have been around for ages and will still work just fine well into the future. HTML 5 won’t be properly finished (we’re told) until 2022. And HTML 5 and CSS3 won’t invalidate their predecessors, which will continue to work just fine.

Probably one reason why older sites are so out of date and hard to upgrade is that browsers of a decade ago were so poor at supporting standards that sites were built on poor foundations. These days, with easy-to-edit templates, separation of presentation and structure, etc., it should be relatively easy to update sites into the future. Even if things like jQuery are replaced by something else, it should be easy to rip them out and plug in whatever follows… assuming the site is properly built in the first place.
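One way to make the “rip out jQuery and plug in whatever follows” part painless is to put a thin adapter between your own code and any library you depend on. This is just a sketch of the idea - the `createDomAdapter` and `stubBackend` names are hypothetical, and a real backend would wrap an actual library:

```javascript
// A thin adapter that keeps the rest of the site's code independent of
// any one library. Only this module knows which library sits underneath;
// swap the backend and the call sites never need to change.
function createDomAdapter(backend) {
  return {
    select: (selector) => backend.select(selector),
    on: (el, event, handler) => backend.on(el, event, handler),
  };
}

// Today the backend might wrap jQuery; in five years, its successor.
// A stub backend stands in for a real library in this sketch:
const stubBackend = {
  select: (selector) => ({ selector }),
  on: (el, event, handler) => ({ el, event, handler }),
};

const dom = createDomAdapter(stubBackend);
```

The point isn’t the stub itself - it’s that site code calls `dom.select(...)` everywhere, so a library swap is a one-file change rather than a rewrite.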

For that matter most web sites are still using HTML 3.2. Many will never switch to HTML 4.

Thanks for your comments.

You both refer mainly to client-side technologies. What about server-side technologies? Take, for example, one of the oldest and largest e-commerce websites on the planet. I’d love to have insider knowledge of how much has been re-coded over time, given that, presumably, the original code base could only take the site so far, and then it probably had a major overhaul (behind the scenes) to allow a whole new wave of expansion.

I guess what I’m fascinated by is whether or not it’s inevitable that major websites will always, one day, need re-coding from the ground up because it simply becomes too costly to carry on maintaining the old/original code base. And if this is true, what lessons should be learnt by server-side developers so that they can take steps to maximise the longevity of a website before a major re-write becomes the cheaper long-term option? I assume the answer lies in ensuring that the code base is of very high quality, i.e. well documented, very flexible (probably object-oriented), and that upgrades and maintenance to such sites are also carried out to the highest of standards.

fyi A long time ago (late 80s, early 90s) I used to be an analyst/programmer for a major international investment bank in the UK, and of course in that massive mainframe environment there were huge amounts of checks and balances in place before any code would reach the live production environment. And yet plenty of poor-quality, poorly optimised code still made it there because, as far as the users were concerned, ‘it worked’. I never got to work anywhere near the core systems (thank gawd!) but I think a lot of it was still written in COBOL - and probably still is today, because it’s just too ‘core’ to risk messing with.

Sorry if I’m meandering a bit in this post. Don’t mind me. I’ll just waffle on!

With server-side technologies you need to update the code to keep up with the latest version of the language.

For example, a lot of PHP scripts written for PHP 3 have failed in the last year or so as servers have been upgraded to PHP 5, because some of the PHP 3 functions they used no longer exist. There was plenty of warning that they wouldn’t be in PHP 5: PHP 4 had already deprecated them.
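One stopgap when an upgrade removes a built-in function is a compatibility shim: re-create the old function on top of whatever replaced it, so thousands of call sites don’t all have to change at once. Here’s a hedged JavaScript sketch of the pattern (the PHP 3 → 5 situation is analogous; `shimSubstr` is a hypothetical name, imagining a runtime that dropped `substr`):

```javascript
// Compatibility shim: rebuild a removed/deprecated function (here, the
// behaviour of String.prototype.substr) from slice, which is the
// still-supported replacement. Old call sites switch to this one helper
// instead of being rewritten individually.
function shimSubstr(str, start, length) {
  // substr treated a negative start as counting back from the end:
  const from = start < 0 ? Math.max(str.length + start, 0) : start;
  return length === undefined
    ? str.slice(from)
    : str.slice(from, from + length);
}

shimSubstr('abcdef', 1, 3); // 'bcd'
shimSubstr('abcdef', -2);   // 'ef'
```

Shims buy time, but they’re exactly the kind of accumulated scaffolding that eventually tips the maintain-vs-rebuild calculation mentioned earlier.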

spaceman, if I may be so bold as to reduce the answer to a simple philosophy: for me, the easiest way to make your website as future-proof as possible is progressive enhancement and graceful degradation. The idea behind the terms is essentially not to code your website for a particular browser (they won’t be around forever) or for a specific set of technologies (they evolve), but to ensure your website starts with something all browsers will understand (the older but reliable syntax) and has increasing levels of modernism layered on top for those browsers which can handle it. And when those newer technologies are used, if the functionality is disabled the browser can fall back onto the previous incarnation of the site, which still functions but perhaps isn’t as flashy.

To put this into an example: say you have a rudimentary HTML page that every browser supports. You can layer on CSS that is fluid (so it works at various PC resolutions, both in scale and in its boundaries), then provide fixed styles on top which determine, based on the viewport width, how content should be enhanced or restyled (giving small-screen modern devices a better go at the task), and finally layer scripting behaviour on top to further increase the site’s functionality. It’s all about layers: something that works for everyone, with stuff that works for most people, with stuff that functions for some people, with bleeding-edge stuff that’s cool for the odd person - all layered in an order of priority, with fall-backs for everything. That’s as close to future-proofed as we can get; after all, no support for a technology lasts forever.

Another key point worth mentioning is that websites should evolve all the time; they shouldn’t be “done” and left to work in the future. Most people forget about evolution and don’t think to the future… rather a shame, as designs aren’t like buildings - they aren’t more appreciated as they age. :slight_smile:
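The layering described above can be sketched as a decision ladder: each client lands on the highest tier its capabilities allow, and every tier falls back to the one below. A minimal sketch - the capability names and tier labels here are hypothetical placeholders, not real feature-detection calls:

```javascript
// Progressive enhancement as a decision ladder. Start from a baseline
// that everyone gets; add a layer only when the capability is actually
// present. Missing a capability simply drops the client one rung down.
function chooseTier(caps) {
  if (!caps.css) return 'plain-html';       // everyone gets working content
  if (!caps.js) return 'styled';            // fluid CSS layout, no scripting
  if (!caps.modernApis) return 'enhanced';  // basic scripted behaviour
  return 'cutting-edge';                    // the flashy layer for the few
}

// A client with scripting but without the newest APIs degrades gracefully:
chooseTier({ css: true, js: true, modernApis: false }); // 'enhanced'
```

In a real page the `caps` values would come from genuine feature detection (testing whether an object or method exists before using it), which is exactly why the approach outlives any one browser generation.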