What is a maximum page size?

How big does a page have to be before it's considered too large? I have been wondering how many kilobytes I can allow for images.

I know that this depends on the website; e.g. CNN, the NYTimes and eBay provide 'unique' services, so a larger size is tolerated.

But what about a general small-business site that has competition? I know people will wait a maximum of 10 seconds, but does anyone still use 56k modems anymore?


Hi drakke.
What is your question actually about?
Do you want to know a generally allowed page size, as implied by your first sentence, or do you want to know the reasonable size of a single page (you don't put all your content on one page anyway), as implied by your last sentence?

  1. Generally there is no limit on page size; it depends on how much space you bought from your host or, if you have your own server, on how much space you have there.
  2. The size of a single page in terms of loading speed depends, as you yourself mentioned, on the internet connections your visitors have, which you can't influence, and on the absolute page size itself, which you can influence.
     I suggest using Firefox as a browser and getting the add-ons "Firebug" and "Page Speed"; the latter requires Firebug to be installed first.
     This tool gives you a good idea of how to improve the loading speed of your pages.
     In case you are interested: http://code.google.com/intl/de-DE/speed/page-speed/docs/extension.html
     Generally you should try to save images at as low a resolution as possible without losing display quality (you will probably have to experiment) and then set the images to the desired display size in your HTML/XHTML and CSS files.
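As a rough sketch of the kind of inventory a tool like Page Speed starts from, you can list the external assets a page references with nothing but the Python standard library. The sample HTML and file names below are made up for illustration:

```python
# Rough sketch: list the external assets an HTML page references
# (images, scripts, stylesheets), similar in spirit to the inventory
# a tool like Page Speed reports. Standard library only.
from html.parser import HTMLParser

class AssetLister(HTMLParser):
    """Collects the URLs of images, scripts and stylesheets."""
    def __init__(self):
        super().__init__()
        self.assets = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "src" in attrs:
            self.assets.append(attrs["src"])
        elif tag == "script" and "src" in attrs:
            self.assets.append(attrs["src"])
        elif tag == "link" and attrs.get("rel") == "stylesheet" and "href" in attrs:
            self.assets.append(attrs["href"])

# Made-up page source; in practice you would feed in your real HTML.
sample = """
<html><head>
<link rel="stylesheet" href="style.css">
<script src="app.js"></script>
</head><body>
<img src="header.jpg"><img src="logo.png">
</body></html>
"""

parser = AssetLister()
parser.feed(sample)
print(parser.assets)  # → ['style.css', 'app.js', 'header.jpg', 'logo.png']
```

Once you have the list, checking each file's size on disk (or its Content-Length) tells you where the weight actually is.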

If you have to set up a page with a lot of images, the loading time will always be longer, but you can do something to make your page interesting while it is still loading.
That means, if visitors see an empty white page until all the pics are loaded, you have probably lost them.
There are some possibilities for displaying interesting content (like the background and menu) before the page has loaded entirely.
If you have questions about this topic, don’t hesitate to ask.

Hope that gave you some ideas,

It depends on the site in question, but the answer would be "the page weight should be as small as possible". You can't really put a figure on it because it depends on who your target audience is and what the design entails. If your visitors are all on dialup or on mobiles then you need to keep the page weight very low.

If you have a graphic based gallery site then you will obviously need lots of good quality pictures. Of course they should be optimised as much as possible. I see some people using a 200k image as the background of a site and that just slows everything down to a crawl. I get worried when an image starts to get larger than 20k.

Also make sure that the content to code ratio is favourable as you don’t want more code than you have content.
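As a rough, hypothetical way to check that ratio, you can strip the markup and compare the size of the visible text against the size of the full source; the sample markup below is made up:

```python
# Rough sketch: estimate a page's content-to-code ratio by comparing
# the bytes of visible text with the bytes of the full HTML source.
# Standard library only; the sample markup is for illustration.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Accumulates the text content of an HTML document."""
    def __init__(self):
        super().__init__()
        self.text = []

    def handle_data(self, data):
        self.text.append(data)

def content_to_code_ratio(html):
    extractor = TextExtractor()
    extractor.feed(html)
    text_bytes = len("".join(extractor.text).encode("utf-8"))
    total_bytes = len(html.encode("utf-8"))
    return text_bytes / total_bytes

sample = "<html><body><div><p>Just a little text.</p></div></body></html>"
print(round(content_to_code_ratio(sample), 2))  # → 0.3
```

A low ratio means most of what you send over the wire is markup rather than content, which is exactly the situation the advice above warns against.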

Remember that “fast” is probably the best feature of a website.

These links may be of interest.



In terms of file size there is no actual set maximum, as different people have different speed connections and different expectations of how long a wait is acceptable (which may be affected by what they expect to find on the page).

30k total is considered to be the optimum size for the first page people visit on your site when they don't know what to expect, as everyone should wait long enough for that to load. Few sites build their pages that small, though, so you are then relying on something in the first part of the page catching your visitor's interest so that they don't leave before the page finishes loading (as a large number of people always do).

A long time ago, 70KB was considered the maximum for comfortable browsing on the average connection, but nowadays, with many people on DSL, this 70KB convention is a bit obsolete, I believe.

The maximum size you should use is the smallest size you need to use.

I know that isn’t very helpful, but there are no hard and fast rules.

Ideally, you want a total content transfer of at most 100KB, but that can be quite limiting. If it goes up to 200KB, that's still pretty lean. But once you start getting into the territory of over 500KB (e.g. sitepoint.com), you've got a fat page that has to be pretty darned good for people to wait for it to load.
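Those budgets are easy to check against a local build. Here is a minimal sketch using the thresholds above; the asset sizes in the example are hypothetical:

```python
# Rough sketch: add up the files that make up a page and classify the
# total against the budgets discussed above (<=100KB lean, <=200KB
# still lean, over 500KB fat). The file list is made up; in practice
# you would measure your actual HTML, CSS, script and image files.

def page_weight_verdict(asset_sizes_kb):
    """Classify a page by its total transfer size in KB."""
    total = sum(asset_sizes_kb)
    if total <= 100:
        verdict = "lean"
    elif total <= 200:
        verdict = "still pretty lean"
    elif total <= 500:
        verdict = "getting heavy"
    else:
        verdict = "fat"
    return total, verdict

# Hypothetical page: 4KB of HTML, 10KB of CSS, three images.
print(page_weight_verdict([4, 10, 30, 25, 15]))  # → (84, 'lean')
```

Note this only measures uncompressed size on disk; gzip transfer encoding will shrink the text assets further, but images are usually already compressed.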

The reason I ask is that I find that most of my non-content page weight is from images.

I designed a page today that had 4KB of HTML and 60KB of images, and I was wondering if this was a good ratio. This is the first page, and after these elements are cached the rest of the site would load faster.

But maybe there is a way to only load a few of the graphic elements on the first page (with a graceful fallback) and then load the rest on the following pages. That way the first page, which is where the user decides whether they want to visit the site and where the bounce rate is highest, would not be the slowest.

Could you provide a link so we could have a look?
As I said before, if you need a lot of images:

  • I built a website for an artist recently and therefore know what you are talking about:
    you can use certain techniques to keep the page interesting while it is loading.
    As long as visitors don't see an empty white page during the loading process, it will be okay for most users;
    I mean, there is a reason why they visit this particular website, and mostly they are curious about the images.
    Since I don't want to be suspected of advertising a webpage, I won't post you a link,
    but I recommend building an interesting page around the images,
    meaning that if a menu, a header and some nice backgrounds are quickly visible, you shouldn't have problems with visitors leaving out of impatience.
    I must admit, though, that I also limited the number of images per gallery, i.e. per page, to keep it acceptable.

Again: a link to your project would help us give better advice.


The problem with that approach is that you don’t know which of the pages will be a particular visitor’s first page. Many of the visitors to each page will be going there first, a smaller number will have already seen another page.

50-60k of images that are then reused on the other pages isn’t too bad. The sites that tend to lose visitors before the page loads have each image that big or bigger (which is completely unnecessary for the web).

@noRiddle - I develop these locally and cannot make them public yet. It’s nothing really special. If it becomes a real problem I can always store images on Amazon S3.

Most small business sites have so little content that without a good design it would look pretty dull. So it’s a challenge to find a good design that also loads quickly.

I’m glad there is so much research going on to increase speed even as the number of scripts that you can install seems to explode.

It's always worth looking at whether you can optimise your images further. Are they in the most appropriate format (i.e. GIF, PNG or JPG)? If they are JPG, can you reduce the quality any further? Bear in mind that a photo on a website rarely needs to be above about 75-80% quality; higher quality than that won't generally be noticeable, so the larger file size is just wasted.

Also, save JPEGs as progressive, so that (most) browsers will gradually display them while they’re loading.

Also, save JPEGs as progressive, so that (most) browsers will gradually display them while they’re loading.

I know that when doing this with lossless types like GIF (interlaced), your total filesize becomes larger. Does this also happen with JPEGs?

I'm also not sure what Firefox would do with a half-loaded image: when there is no image, it remains an inline element and affects the layout as an inline. Once the image is downloaded, it gets dimensions like a block. The other browsers don't have this issue because they honour the stated dimensions of an image whether it loads or not.

A test on an 800x600 photo, saved with Photoshop's Save for Web & Devices at quality 70, with and without optimization, shows:

Standard: 120.7 KB
Progressive: 119.9 KB
Standard/Optimized: 120.2 KB
Progressive/Optimized: 119.9 KB

This suggests that, with Photoshop at least, progressive can actually be smaller, and that optimizing has no extra benefit for progressive.
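You can run the same kind of comparison yourself. This is a minimal sketch assuming the Pillow library is installed; the test image is generated, so the exact byte counts are illustrative rather than a benchmark:

```python
# Rough sketch: compare baseline vs progressive JPEG sizes, assuming
# the Pillow library is available. A synthetic gradient stands in for
# a real photo, so absolute byte counts are illustrative only.
import io
from PIL import Image

def jpeg_size(img, progressive):
    """Return the encoded size in bytes for the given JPEG settings."""
    buf = io.BytesIO()
    img.save(buf, "JPEG", quality=70, optimize=True, progressive=progressive)
    return buf.tell()

# Generate an 800x600 gradient as a stand-in photo.
img = Image.new("RGB", (800, 600))
img.putdata([(x % 256, y % 256, (x + y) % 256)
             for y in range(600) for x in range(800)])

print("baseline:   ", jpeg_size(img, progressive=False), "bytes")
print("progressive:", jpeg_size(img, progressive=True), "bytes")
```

Which variant comes out smaller depends on the image content, which is why measuring your own photos, as the Photoshop test above does, beats any rule of thumb.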