Hiding background-image until it has fully loaded?

Hello,
In a site I am designing, the body has a really nice background image that, unfortunately, is a big file. I’ve made it smaller, but the server the site is hosted on doesn’t have the best connection speed, so you can see the image slowly loading.

As some of you may remember from the ’90s, watching an image load isn’t the prettiest thing.

I was wondering if it is possible to hide the image until it has fully loaded. I’m not sure CSS alone can do this, but I posted it here because it is not an inline image; it is the url value of the <body>’s background property in the CSS.

Does anyone know how to achieve this, or how to approach it? If it turns out it can only be done in JavaScript or another language, could someone please contact a moderator to move this to that forum?

Thanks in Advance,
Team 1504

The only way I know of to do this is with JavaScript: tell the page to wait until the image has completely loaded, then write the property back into the <body> element. See the following example for a common method of image pre-loading.

[I]Code source: http://jsfiddle.net/vDmAe/2/[/I]

One of my core rules on a website is that if an image feels too large, it is. It’s all part of what I’ve come to call “not viable for web deployment”, and it SOUNDS like your image falls into that category… so use another image! You are probably using a massive image that’s basically pissing on your content and driving users away, so get rid of it!

Mind you, there may be compression or format techniques or other ways of building the image to help get it under control – but we’d have to see the image in question and the site it’s on to say for sure…

But if an image file that’s nothing more than presentation (like a background) is too big, GET RID OF IT… just like how, if your code starts to feel overcomplicated and needlessly complex, it already was a week ago. It fits into the whole “but I can do it in Photoshop” idiocy that results in bounce-monster pages. A background image isn’t content, so don’t let it ruin your ability to deliver the content.

Unless of course said background is being used as flash to hide the lack of substance.

If it’s a JPEG image, save it without the “progressive” option.

It is! I’ll go back into Photoshop and do that.

What does that progressive option do?

Ticking the progressive option means the image is saved in multiple passes: a rough, blurry version of the whole image appears first and then sharpens as the rest of the data arrives, rather than the image drawing in from the top down until it is fully loaded.

@Victorinox, I tried un-ticking progressive and that still didn’t work. It still loads like before.

Unless of course said background is being used as flash to hide the lack of substance.

@deathshadow60: no, it’s just there to set the scene. Also, I consider background images measured in megabytes to be way too big; this one is less than half a megabyte.

I’ll try the jsFiddle method.

@chris_upjohn, with the jsFiddle option, how do I know when the image has finished loading? Isn’t the time different for each user’s connection speed?
Is there a way I can load the image in the background while the page loads, then detect that the image has fully loaded and apply the style?

The jsFiddle I posted does exactly that. If you view the source, you will see I used JavaScript to create a new image object, then used an interval to check whether it has finished loading, which is the [B]img.complete[/B] check.

While it may look like the fiddle isn’t doing anything, you should notice that instead of the background loading in segments from top to bottom, it appears in one go because it has been pre-loaded into the page already.
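
If you can’t open the fiddle, a rough sketch of the same idea (not the fiddle’s exact code; the image path and the polling interval are just placeholders) would be something along these lines:

var img = new Image();
img.src = 'path/to/background.jpg'; // placeholder path

// poll until the browser reports the image as fully downloaded
var timer = setInterval(function() {
     if (img.complete) {
          clearInterval(timer);
          document.body.style.backgroundImage = 'url(' + img.src + ')';
     }
}, 100);

The background stays off the <body> until img.complete is true, and only then is the background-image written in.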


The problem with the jsFiddle approach is that you’re taking something too big for web deployment and throwing more code (and as such more wasted bandwidth) at it.

You just said “less than half a megabyte”. That’s still almost four times the upper limit I’d allow for HTML + CSS + images + scripts in an entire page skin, and seven or eight times my ideal page target. If a purely presentational background image is more than 10-15k, I’d be swinging an axe at that bit of the design as not viable.

Something like this should work.


setTimeout(function() {
     // create a detached image object so the file starts downloading right away
     var img = new Image();
     // once the file has finished downloading, apply it as the body background
     img.onload = function() {
          document.getElementsByTagName('body')[0].style.backgroundImage = 'url(' + img.src + ')';
     };
     // setting src kicks off the request
     img.src = 'pathtoimage.jpg';
}, 1);

That’s the most convoluted way I’ve ever seen to access BODY. Why not just…

document.body.style.backgroundImage = 'url(' + img.src + ')';

… since that’s a direct reference to the same element and avoids the overhead of searching the entire DOM and returning a collection?

I have to agree with @deathshadow60 on this one.

Saving the image with progressive ticked actually seems to get it working like I wanted, oddly enough. By default it is unticked, and that’s how I had saved image files all the time before.
I’d noticed this problem with large images, so I tried ticking the progressive option, and now when I view the page the image appears blurred at first and then finishes loading in proper focus.

However, if I were to choose one of the suggested solutions, I would pick the jsFiddle script; it seems the most efficient and works the way it should.

Thank you all kindly!

I thought “progressive” was like GIF’s “interlacing”… that is, saving the image as progressive, like saving with interlacing, ultimately makes the file bigger in size?

Not entirely, and the file size depends on the image. Progressive can often be smaller on images with low levels of detail, like line art, and much larger on images with lots of fine detail, like photographs.

It’s not really ‘interlaced’ in the conventional sense of scanlines; it’s more a series of smaller ‘detail blocks’. An example of progressive scan would be to send the entire image as 4x4 averaged blocks (a single mean color each), then send 2x2 blocks containing ‘skew’ differences of luminance, then do a final per-pixel pass for any values that aren’t already at what they should be. Much like building a page with progressive enhancement, progressive is taking a rough baseline and then adding more and more detail on top of it. If you have large areas of a single flat color, progressive will result in a smaller file (just as progressive video can use less bandwidth because, like an animated GIF, only the parts of the frame that changed from the last frame are sent).

Interlacing just means “data every other” – like every other pixel row or every other scanline… that’s why there’s a difference between 480i and 480p in video.

Thank you for the explanation. I was wondering why that difference existed in video.

So an interlaced PNG is a PNG where every other pixel line is loaded, right?

PNG interlacing works on both axes instead of by scanline. It first gives you 1 out of every 4 dots in a 2x2 grid, then it gives you the missing horizontal pixels on one scanline (shown in most agents at doubled height), and then the missing scanlines… or at least that’s my understanding of it.

Though support for interlaced PNG is spotty at best. Most browsers handle it after a fashion, but a lot of libraries for other software get confused by it. Because of the interlacing, the zlib compressor usually doesn’t work as well, so there’s little reason to use PNG interlacing.

Especially since PNG compression of anything that really ‘needs’ a compressor is piss poor anyways – part of why I avoid 24 bit .png (or worse 32 bit png with that pesky alpha channel) like the plague.

I didn’t know one could export to 32-bit PNG. Hmm, I would ask how one does it, but based on your description of it, I’m thinking you wouldn’t want to tell me / it is a bad idea anyway.

If you really, really must use a background image of that size and you can’t make it smaller using Photoshop or whatever, then you could “preload” the image into a hidden <img> at the very top of the body element. Then put an onload function on that <img> to assign the img’s src as the background image for the body.

Image files in an <img> are downloaded in a stream parallel to the rest of the page content.
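
Something like this is what I have in mind; it’s only a sketch, and the file name is just a placeholder:

<body>
     <!-- hidden pre-loader image; its onload fires once the file has fully downloaded -->
     <img src="big-background.jpg" alt="" style="display: none;"
          onload="document.body.style.backgroundImage = 'url(' + this.src + ')';">
     <!-- the rest of the page content renders while the image downloads -->
</body>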

24-bit color PNGs with alpha transparency are 32-bit; the moment you use alpha transparency you’re at either 16 or 32 bits… 8 or 24 bits of color channels + 8 bits of alpha.

But they are also big, fat, slow memory hogs that can make scrolling painful in some browsers (FF) and have browser support issues (IE). In my experience they are rarely if ever necessary (“close enough AA” with palette transparency is close enough, or just pre-composite), and in the handful of cases where one is “necessary” to the design, you tell the artsy-fartsy “designer” NO!