Great, I didn’t know that some of my doubts about how images are displayed are actually documented in the specs! In this particular case, though, I think you are partially right and partially wrong. It is correct that an image is an inline element by default and does not accept dimensions, but it becomes a replaced element when it already has intrinsic dimensions (e.g. from the dimension attributes or CSS rules) and the user agent has reason to believe that the image will become available and be rendered in due course - the second case in that paragraph. So images waiting to be downloaded should, to my mind, accept dimensions (though broken images shouldn’t, as you observed). I suspect it was the unusual combination of an initial height: auto
followed by setting dimensions in JS that threw Firefox off.
I hadn’t thought of this use case, but you are certainly right: broken images, or those not meant to be downloaded, should stay inline and not accept dimensions. From my tests, all browsers except IE and Edge follow this behaviour, and my JS experiments do not affect it.
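The distinction above can also be checked from script: an image that is still downloading reports complete as false, while a broken one is complete but has no intrinsic size. A minimal sketch (the function name imgState is mine; it only reads standard HTMLImageElement properties, so it can be exercised with plain objects as well):

```javascript
// Classify an image per the cases discussed above.
// Works on any object exposing `complete` and `naturalWidth`
// (e.g. an HTMLImageElement).
function imgState(img) {
  if (!img.complete) return 'loading';               // still waiting to download
  return img.naturalWidth > 0 ? 'loaded' : 'broken'; // broken images report 0
}
```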
This is a good point, we shouldn’t cause slowdowns for browser resizing (I’m also a ‘resizer’!). This would need some testing to know how much the script affects speed. Just for fun, though, I put the for
loop in my script inside another for
loop with 1000 iterations and noticed only a slight slowdown in browser resizing; with 10 or 100 iterations I didn’t notice any - which suggests there is quite a lot of headroom. Besides, the slowdowns don’t affect resizing of the main browser window, since modern browsers handle content in a separate process, so only a small fps drop might be noticeable within the window.
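The stress test above can be sketched roughly like this (stressTest is a hypothetical helper of mine, not part of the demo script):

```javascript
// Run a piece of work many times and report elapsed milliseconds,
// to gauge how much headroom a resize handler has.
function stressTest(iterations, work) {
  var start = Date.now();
  for (var i = 0; i < iterations; i++) {
    work(); // e.g. the image-resizing loop from the script below
  }
  return Date.now() - start;
}
```

Calling it with 1000 iterations and the resize routine as the work function approximates the experiment described above.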
Anyway, wanting to improve it, I’ve come up with an idea: we could cut down the performance penalty by using JS to resize only images that are still waiting to be downloaded - those that are fully available can be safely delegated back to pure CSS resizing with height: auto
. It would go like this:
<!DOCTYPE HTML>
<html>
<head>
  <title>Image</title>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width,initial-scale=1">
  <style type="text/css">
    body {
      margin: 10px;
    }
    .img {
      outline: 1px solid red;
    }
  </style>
</head>
<body>
  <script>
    function setImgAspectRatio() {
      // Only touch images that have not been handed back to CSS yet.
      var imgs = document.querySelectorAll('img.img:not([data-loaded])');
      for (var i = 0; i < imgs.length; i++) {
        var img = imgs[i];
        if (!img.complete) {
          // Image not loaded yet: size it with JS, preserving the ratio
          // of the width/height attributes and capping at the parent width.
          var ratio = img.getAttribute('width') / img.getAttribute('height');
          var maxWidth = img.parentNode.offsetWidth;
          var targetWidth = Math.min(img.getAttribute('width'), maxWidth);
          img.style.height = (targetWidth / ratio) + 'px';
          img.style.width = targetWidth + 'px';
        } else {
          // Image loaded: hand it back to pure CSS resizing from now on.
          img.style.outlineColor = 'black';
          img.setAttribute("data-loaded", "1");
          img.style.removeProperty("width");
          img.style.maxWidth = "100%";
          img.style.height = "auto";
        }
      }
    }
    // Set image sizes while the page HTML is still loading:
    var arInterval = setInterval(setImgAspectRatio, 1000);
    document.addEventListener('DOMContentLoaded', function() {
      clearInterval(arInterval);
      setImgAspectRatio();
      setTimeout(setImgAspectRatio, 0);
      window.addEventListener('resize', setImgAspectRatio);
      window.addEventListener('load', setImgAspectRatio);
    });
  </script>
  <div>
    <p>leading text content</p>
    <img src="600x300.php" width="600" height="300" class="img" alt="My Image">
    <p>trailing text content</p>
    <img src="delayed-img.php?700x200" width="700" height="200" class="img" alt="My Image 2">
    <p>trailing text content 2</p>
  </div>
</body>
</html>
Live demo.
I set the outline colour to black only to indicate that an image has been handed back to CSS resizing. This should carry a very minimal performance penalty once the images have been downloaded. There were a few issues I needed to deal with; for example, IE doesn’t fire resize events before all images are downloaded, so JS resizing doesn’t work then - that’s why I added the window load handler, just to ensure things are displayed properly in the final stage. I also had to add setTimeout(setImgAspectRatio, 0);
in the document-ready handler to cover cases where scrollbars could be erroneously included in the calculation of the parent element’s 100% width.
This could be further enhanced with throttle/debounce techniques if necessary (though I think that would be overkill). The more I dig into it, the more subtle behaviours and issues come up to cope with, which makes me wonder whether it’s all worth the hassle. I have a feeling I’m trying to code something that browsers should be able to do on their own with a one-line CSS rule…
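For reference, the debounce idea mentioned above could look like this - a generic sketch, not tied to the demo script (the name debounce is mine):

```javascript
// Return a wrapped function that postpones calls to `fn` until
// `delay` ms have passed without a new call - so a burst of resize
// events results in a single execution at the end.
function debounce(fn, delay) {
  var timer = null;
  return function () {
    var args = arguments;
    var self = this;
    clearTimeout(timer);
    timer = setTimeout(function () {
      fn.apply(self, args);
    }, delay);
  };
}
// e.g. window.addEventListener('resize', debounce(resizeHandler, 100));
```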
Well, yes, I certainly agree that it’s good practice not to use dangling inline elements, but that is a somewhat different topic. Here I was rather concerned with using wrapper elements for the purpose of applying proportional dimensions with the padding trick. To keep the HTML clean I want a universal solution; for example, I will usually have page content like this:
<p>This is some text.</p>
<p><img src="img.jpg" width="500" height="255" alt="my dog"></p>
<p>This is some more text.</p>
<div style="float: right"><img src="img.jpg" width="200" height="100" alt="my snake"></div>
This might come from an editor in an admin panel or some outside source, and I don’t want to have to fix the HTML merely so images scale on small-screen devices. I simply want this HTML to work on any device without modifications. That’s why I said I don’t want to use wrapper divs for this purpose.
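For comparison, the padding trick mentioned above reserves space with percentage padding, which resolves against the parent’s width; the percentage itself is just height divided by width. A tiny helper to compute it (paddingBottomFor is a hypothetical name):

```javascript
// Compute the padding-bottom percentage that reserves space for an
// image of the given intrinsic dimensions inside a wrapper, e.g.
// <div style="position: relative; padding-bottom: 50%">...</div>
function paddingBottomFor(width, height) {
  return (height / width * 100) + '%';
}
// paddingBottomFor(600, 300) gives '50%'
```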