Full Circle in Web Development

Before most people had high-speed Internet, web pages were created with the mindset that they had to load fast and take up only a few hundred kilobytes at most. Then, in brief, the dawn of high-speed shone upon us, and most (though not all) of those developing web pages threw every bell and whistle onto their pages. There weren't really any bandwidth restrictions to speak of, and anything went, regardless of whether the design was horrific or not.

Fast forward to the present, and more and more Internet users face monthly bandwidth caps of 2, 8, etc. gigabytes (blame the telco companies and their greed). From my perspective we are going full circle, back to the time when dial-up had the "it" factor and web pages had to be created with delicacy.

The difference now is that we have a little more room to make a page's total size somewhat larger. Unlike the anything-goes gap in web development, we face almost the same hurdle as when pages were developed for dial-up, except that a heavy page no longer takes ages to load; instead it eats into your monthly quota. We are once again concerned with caching pages to reduce repeat content downloads, and some developers are making sure that any new content which must be downloaded is as small as they can make it.
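One concrete way developers shrink what must be downloaded is compressing text payloads before they go over the wire. A minimal sketch, using Python's standard `gzip` module on a hypothetical, deliberately repetitive chunk of HTML (real markup compresses less dramatically):

```python
import gzip

# Hypothetical page markup: repetitive on purpose to show the effect.
html = ("<div class='post'><p>Lorem ipsum dolor sit amet.</p></div>\n" * 200).encode("utf-8")

# Compress the payload as a web server with gzip encoding enabled would.
compressed = gzip.compress(html)

print(f"raw: {len(html)} bytes, gzipped: {len(compressed)} bytes")
```

Against a metered connection, it is the gzipped size, not the raw size, that counts toward the user's quota.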

Thank You.

I still follow Alex’s basic premise of building efficiently.

As download speeds have quickened, users' expectations have grown shorter. Way back when 56k modems were the norm, people were used to waiting 20-30 seconds for a page to load. Nowadays, on a fast connection, that waiting tolerance has dropped to just a few seconds; any more and they use another site. If a page weighs in at 1 MB (not uncommon), that still takes several seconds to load on an average 8 Mb connection; at 100 kB it would load like a shot (better customer perception et al.).
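The arithmetic behind those figures can be sketched as a lower bound on transfer time; the real load time is longer once latency, TCP slow start and protocol overhead are added:

```python
def transfer_seconds(size_bytes: int, link_mbps: float) -> float:
    """Idealised transfer time: payload bits divided by link speed.

    Ignores latency, TCP slow start, headers and contention, so real
    pages take noticeably longer than this lower bound.
    """
    return size_bytes * 8 / (link_mbps * 1_000_000)

print(transfer_seconds(1_000_000, 8))  # 1 MB on an 8 Mb/s link: 1 s minimum
print(transfer_seconds(100_000, 8))    # 100 kB: a tenth of that
```

So even before overhead, the 1 MB page costs ten times the raw transfer time of the 100 kB one.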

One thing that I rarely see mentioned is that "heavy" sites use more disk space and data transfer, and thus cost more to run, especially when busy. Imagine what it would cost Google if they stuck a 100 kB image on their home page: thousands, possibly millions(?) of dollars over a few years.
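A back-of-envelope sketch of that claim, where every number is an assumption chosen purely for illustration (traffic volume and bandwidth pricing are not from the original post):

```python
# Back-of-envelope only: every figure below is a hypothetical assumption.
views_per_day = 1_000_000_000        # assumed daily page views
extra_bytes = 100 * 1024             # the hypothetical 100 kB image
dollars_per_gb = 0.01                # assumed bulk bandwidth price, $/GB

extra_gb_per_day = views_per_day * extra_bytes / 1024 ** 3
annual_cost = extra_gb_per_day * dollars_per_gb * 365

print(round(annual_cost))  # extra dollars per year for that one asset
```

Under these made-up numbers the single extra asset costs hundreds of thousands of dollars a year, so "millions over a few years" is at least plausible at that scale.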

The myth of ubiquity: as something becomes more widely available, people assume the older constraints are no longer relevant. It's something that happens quite regularly (and always has) through the history of the web, though I guess the lesson is that complacency is no substitute for good judgement about the circumstances and needs of your audience. Speed has always been an issue for websites; that's my main point in this case. :slight_smile:

Which myth is that?

Off Topic:

You mean decades if Jooooolia wins, right? >_>

As the internet has got faster, people's expectations of load times have also changed. Where someone on dial-up might wait up to 30 seconds for a page to load (and hence the page could be a huge 80 kB total and still retain half its visitors), those on moderately fast broadband will only wait four or five seconds (within which that same 80 kB page loads with room to spare).

Ideally, pages should load in under a second, so until everyone is using a superfast connection that can manage that sort of load time, the optimum page size will remain in the 30-40 kB range if you want the page to load before people start looking elsewhere.

Also, while some parts of the world may not have had bandwidth restrictions on high-speed internet in the past, most of the world has always had those sorts of restrictions, and if anything the amount of bandwidth a given amount of money buys has been slowly increasing over the past few years. What bought me 15 GB a month three years ago now buys me 40 GB. I'm still limited to ADSL2+ speeds, though, and how long it will be before a faster optic-fibre solution becomes available depends a lot on who wins the election in two weeks' time: it might be a few years if one party wins, and decades or more if the other does.

Only once a significant fraction of the world has a superfast optic-fibre internet connection will page sizes above 100 kB become practical, except where you are supplying something your visitors want badly enough to actually wait for it to load.

I don't think a full circle has occurred… I've never stopped being concerned with speed or efficiency, because most people realise that 56k hasn't disappeared: it's still something that over a third of the world's population has to deal with (in terms of speed, if not the actual connection type). Stereotyping the audience as the Western world is just one example of designers failing to meet their audience's needs, and mobile phones, as usual, have given everyone a firm slap in the face: the eye candy isn't always in users' best interests (nor are those hideously large Flash files). If you think we've recycled, you probably fell for the myth. :slight_smile:

What SiberianHuskey says: it's like the automotive industry coming full circle every time the petrol price goes up!

You, as a programmer, have gained over these years ways to make use of the web a more provident enterprise. Let's not forget AJAX, and the three-layer separation: content, presentation, behaviour.

And don't assume all sites have a global audience. Like restaurants: there are fast-food joints, and there are classy, demanding places you can't get into without a reservation.

Your vision is a little bleak in a world of HD over the internet. Things will always go like this: some will have more, some less. I myself started surfing in the days of 2 kb dial-up. Lord only knows pictures were precious… even a two-thirds display instead of a full one counted as a good picture! Talking about Flash, online play or online transactions was trash talk!

Don't get me started about accessibility to information, to forums such as this. Oh, the pain, oh, the terror of a search! You should remember this (if you lived through it)! Even if you wanted to, you couldn't pay for faster, bigger access. Getting frameworks or kits in those days meant multi-volume archives split to floppy size! Today you have options; back then it was only one way, one form, one size!

When I jumped to 152 kb on my CDMA mobile… I also paid a lot more! And I had to use a terrestrial antenna on my phone to boost the signal! Man, the things you forget! There are things I've forgotten about my surfing history that would make a good comedy.

There is no way we're coming full circle in web development. It has only gotten more complex with regard to deployment: considerations about how to serve your content have taken over from the actual work you do on the page content itself. Diversity forces you to account for many more ways, needs and capabilities of surfers than back then.

Today, as a web developer, you can choose to express yourself either in an expansive form or in a minimalistic manner, options that didn't exist in the beginning. 2 GB is little, you say? Remember when 64 MB of RAM, a PII and a 4 GB HDD made a gaming machine? It's not that far back, as I'm not that old yet! :slight_smile:

Today, as a web user, I have the privilege of choosing among ISPs, mobile or otherwise; of choosing a connection, and the type of limit on that connection: speed or traffic. I can use smarter browsers to help me save on bandwidth and traffic. I'd say it's a different planet, not just a different circle!

Possibly, but certainly much sooner than the alternative, where the overloaded copper network is retained, efforts are made to squeeze that last 0.001% out of it, and nothing faster is planned, as is proposed if she doesn't win.