HTTP Requests/Loading Time

Hey everyone.

I was thinking…I don’t know much about HTTP requests, but they seem to slow down loading times a bit. I assume it happens when the browser encounters something on the page, e.g. an image, and then has to ask the server for that file.

So then is it a good idea to, at the end, put all of your JavaScript and CSS into a single HTML page to eliminate those requests?

~TehYoyo

No. If you do that, the CSS/JavaScript won’t cache.

But will the HTTP requests (am I even using that term right?) take less time?

~TehYoyo

No, it won’t take less time. You’ll have to download a larger file.

@TehYoyo.

So then is it a good idea to, at the end, put all of your JavaScript and CSS into a single HTML page to eliminate those requests?

Experiment with http://tools.pingdom.com/ or one of the many other tools that graphically show the total time taken and the time for each individual HTTP request.

Maybe try a cached landing page with minimal external files.

Do you really need to load all the external files?

TehYoyo, the answer to your question is yes: if you put all your JS and CSS into the HTML itself, there will be fewer HTTP requests, and your page will load faster. However, you’d be missing an opportunity on repeat visits, because none of the JS or CSS can be cached. What you can do instead is combine all your scripts into a single JS file, and all your styles into a single CSS file. That way you minimize requests and can still use the browser cache.
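Something like this, as a minimal sketch (the file names are hypothetical):

```html
<link rel="stylesheet" href="all.css">  <!-- every stylesheet merged into one cacheable request -->
<script src="all.js"></script>          <!-- every script merged into one cacheable request -->
```

Two requests on the first visit, and on repeat visits both come straight from the browser cache.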

It would be an absolute nightmare to change the CSS site-wide then. Plus, as mentioned before, it won’t cache.

I mean, combining it would be an action that I do right before uploading. I’d have a real copy on my home comp.

Does caching really matter much? I mean, I imagine that the file would only be about 100kb max - and that would be a huge HTML file…really?

~TehYoyo

What if you made changes at a later time when you have 100 plus pages? Good luck with that :slight_smile:

And for those with slow connections, it makes all the difference in the world.

From apparently knowledgeable posters, I was led to believe that it is better to load a separate CSS file. After performing numerous tests with sites that actually show the total page load time and the individual file load times, I now firmly believe that it is not easy to make assumptions or recommendations.

Every page is different, and the majority of the time is spent waiting for the site to respond. Some files, although small in size, take far longer to load than larger files. I think it is far better to examine the individual page-loading results and to experiment.

Check out this post and especially its source page. In both cases the CSS is included rather than in an external file:

Also note the source file for Google.com; once again, the CSS is included.

I appreciate that both examples are landing pages, which may not require a separate CSS file (which is only helpful for the next page view). If the landing page does not show rapidly, then there is less chance of a potential customer viewing more pages from the same site.

Food for thought :slight_smile:

Better than that is to turn on the MySQL cache and the PHP cache, and also turn on the Apache cache.

If you meant one CSS file per media target, one JavaScript file, and one HTML file, then yes, that would be faster than, say… four CSS files and four JavaScript files… since that would be six fewer requests. If you mean dumping EVERYTHING into a single HTML file… BAD IDEA.

It’s why having a page that calls twenty separate script files can take anywhere from 4 to 40 extra seconds depending on the ping-time to the server… really though that’s an extreme example.

BUT – keeping certain things separate from your HTML file – like CSS and scripts – means that sub-pages will load faster because those are cached. If you use the same CSS on thirty different pages, and a user visits all those pages, the single separate CSS file gets called only once… if you put it all in the HTML, they’d have to redownload all of it on every sub-page.

To that end, making your first page load take a wee bit longer CAN be advantageous; a great example is a forum where the average visitor hits at least five pages. If you had 10k of markup per page on average and 40k of CSS covering all the different things per page, five page visits means 5 × (10k + 40k) = 250k as ‘single HTML files’, but only 5 × 10k + 40k = 90k if you put the CSS external and it’s cached. Not only that, those sub-pages will load and render faster because the CSS for them is already there – which is why separating out CSS that’s only used on sub-pages can often cost you bandwidth instead of saving it, and make the page seem to load slower. If you pre-cache it all in a single CSS file on the first page they visit, it’ll be picked up by all the other pages they visit on your site.

Notice also I say “per media target” – Screen is just ONE thing you should target with CSS, and putting say… your PRINT or HANDHELD stylesheets, or your media queries for modern mobiles, inline in the markup would be a waste of bandwidth for the people who don’t use them!

There’s a reason I go media=“screen,projection,tv” on my LINK to the screen stylesheet.
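For example, a minimal sketch of the per-media-target LINKs (the file names are hypothetical):

```html
<link rel="stylesheet" href="screen.css" media="screen,projection,tv">
<link rel="stylesheet" href="print.css" media="print">
```

Each media target gets its own cacheable file instead of padding out the markup with rules most visitors never use.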

Really the biggest place you can save on handshakes, apart from “don’t call dozens of separate script files”, is on icons and other small images on a page, and any presentational images you use. If you can combine them down to a single file, that’s where the real benefits occur – see the inaccurately named “CSS sprites” for where that comes into play.

Even just simple sliding-doors menu states:

Using one image to do what more conventional methods would take six separate files.
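A minimal sketch of the idea (the selectors, sizes, and sprite.png are hypothetical), assuming the normal and hover states are stacked vertically in one image:

```css
.menu a {
  display: block;
  width: 120px;
  height: 30px;
  background: url(sprite.png) 0 0 no-repeat; /* top half of the sprite: normal state */
}
.menu a:hover {
  background-position: 0 -30px; /* slide the sprite up to show the hover state */
}
```

One image, one handshake – and since the hover state is in the same file, it’s already downloaded, so there’s no flicker on first mouse-over.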

It’s why before we started doing CSS3 stuff for rounded corners and other corner/shadow effects I came up with Eight Corners under one roof – where most systems would use eight or nine images, I use just one.

Bottom line – as everyone else here has said, dumping it all into every HTML file is just wasting bandwidth; user agents can handle up to 8 requests at once, and a good rule of thumb is that anything less than 16 ‘requests’ total for a page template (not counting content images/objects) is entirely acceptable thanks to that overlap of requests. It’s only really a concern when you’re talking completely noodle-doodle numbers of files, like 30 or more.

Like anything else, it’s a balancing act – in this case cache vs. requests. It’s also a hefty part of why practicing separation of presentation (CSS) from content (HTML) reaps real benefits – the more ‘like’ appearance you can move into the CSS file, the smaller all your HTML files are and the less bandwidth you use; and the faster all your sub-pages will appear to the user after their first visit.

Reminds me a bit of the whole “embed small images in the CSS” thing – it sounded like a good idea, one less request and all, but by the time you base64-encoded the images to put them in the CSS, their increase in size would offset any gains you’d have from losing the handshake unless you’re talking >500ms ping times… which happens, but why would you penalize the people with fast connections?
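For reference, the embedding looks something like this (a sketch – the base64 payload here is truncated and purely illustrative):

```css
.icon {
  /* the image bytes ride along inside the stylesheet instead of as a separate request */
  background-image: url("data:image/png;base64,iVBORw0KGgoAAAANSUhEUg...");
}
```

Base64 inflates the payload by roughly a third, which is the size penalty being weighed against the saved handshake.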

The flaw is, you’re thinking of a single self-contained page – what if the visitor goes to another page that shares the SAME CSS? Cache kicks in – which is where the separate external file wins… and don’t discount pre-caching. From the day JS was introduced, people set up image precaching for hovers (see that mm_ scripted idiocy from Macromedia/Adobe)… precaching sub-page appearances and sharing like elements means all your sub-pages load faster for visitors.

You know, the defense people use for jquery :smiley:

Though really that’s the OTHER advantage to an external CSS file – not only is it shared across pages for caching reasons, it can also be used to share appearance across pages, keeping the HTML to its semantic minimums (or not so semantic and not so minimal if you’re still doing 3.2 as Transitional, or 5), making reskinning an entire site easier… often without touching the HTML.

I’m using two external CSS files - one “global” CSS file that is meant to cache across all pages, containing the navbar, the footer, the header, basic font choices and sizes, and universal things like that.

The other is page-specific. Should I combine all eight pages’ worth of individual, page-specific CSS into a single “global” CSS file that caches?
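In other words, each page currently looks something like this (a sketch; the file names are hypothetical):

```html
<link rel="stylesheet" href="global.css"> <!-- shared: navbar, header, footer, base fonts -->
<link rel="stylesheet" href="about.css">  <!-- rules used only on this one page -->
```

versus folding all eight page-specific files into global.css so there’s a single stylesheet request site-wide.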

Keep in mind that the average user leaves after (is it 4 seconds?) of waiting. If it takes a super long time to load (it shouldn’t - just text, right?), then they leave, and it’s a bounce, and all that caching is wasted.

Thoughts?
~TehYoyo

Edit: DS - Eight is a lot of requests…can’t imagine making more than 4 or 5 on a single page…and that’s a lot.

For me the question would be how big they are. If your total CSS is less than 32k, I’d go ahead and put them in a single file; that’s 4 seconds of transfer at DIALUP speeds, or around a quarter of a second at 768kbps DSL.

If you need more than 32k for anything less than a full-on forum (and even then I’d only go to 64k max for ALL pages), there’s probably something wrong with the layout.

… and I’m talking uncompressed – what is it after the server gzips it?
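If the server isn’t already doing that, here’s a minimal sketch of what turning it on might look like in an Apache .htaccess – assuming mod_deflate and mod_expires are enabled; this is illustrative, not a complete config:

```apache
# compress text resources on the way out
AddOutputFilterByType DEFLATE text/html text/css application/javascript
# let browsers cache static CSS/JS for a month
ExpiresActive On
ExpiresByType text/css "access plus 1 month"
ExpiresByType application/javascript "access plus 1 month"
```

Gzip typically knocks CSS and JS down to a quarter or a third of their raw size, which changes the arithmetic on what counts as “too big”.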

If the bounce rate is high enough for that to be an issue, there’s something wrong with the page. If you can’t get a visitor to at least drill down to a sub-page, there’s something wrong.

Are you counting your images in that? Remember, ANY external file counts as a handshake. The average for most well written websites is around 12 to 15… so the average for most of the Internet is around 50. :smiley:

Take the current SitePoint forums index – it’s 33 files totalling 655k (198k once compressed by the server… the poor overburdened server forced to compress 426k of JavaScript)… and fifteen of that 33-file count is JUST scripting… again, 426k uncompressed, 140k compressed – so half the handshakes and three-quarters of the bandwidth – all to deliver 6.4k of plain text. :frowning:

… and the sad part is, as vBull installs go, sitepoint’s one of the LEANER ones.

But take your average WordPress install once people add a bunch of plugins and a fancy skin, and you’re looking at six to two dozen scripting files, a dozen or more presentational images, multiple stylesheets for nothing, etc., etc…

Well, I just started yesterday, so… :slight_smile:

Are you counting your images in that? Remember, ANY external file counts as a handshake. The average for most well written websites is around 12 to 15… so the average for most of the Internet is around 50. :smiley:

I imagine that w/ images I’ll have about 2 CSS, 2 (maybe) JS and 4 or so images (plus an added favicon).

… and the sad part is, as vBull installs go, sitepoint’s one of the LEANER ones.

The price that you pay for using one of the two well-known forum packages out there that are reliable. And most importantly, premade. I imagine making a forum would be a living devil’s home w/ all of the scripting, etc.

But take your average WordPress install once people add a bunch of plugins and a fancy skin, and you’re looking at six to two dozen scripting files, a dozen or more presentational images, multiple stylesheets for nothing, etc., etc…

It’s that stupid “I can do it on Photoshop” mentality. :smiley:

~TehYoyo

People just don’t see the work needed to code their own forum script when they can get by with vBulletin. The amount of coding in it is amazing. Granted, it’s not the best, and as we’ve seen the code isn’t all that great… but it does the job, and it has a good reputation as great forum software.

I’ve attempted coding a forum before and I gave up after a week, having hardly gotten anything done. There’s just simply too much to it, and all the features that vBulletin has (along with addon scripts) make it not even worth it to reinvent the wheel, unless you plan on somehow marketing it, in which case you have to ask yourself whether you could make a profit after all the time and effort you put into it :slight_smile:

Right. Why spend the effort when you can just use a premade tool that works adequately?

I will stick to monitoring the loading times of the site items and optimise where required.

I still like Google’s approach of keeping the CSS to a minimum (definitely < 4k), including it in the main file and eliminating a precious HTTP request.
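That approach amounts to a small style block in the page head instead of an external stylesheet request (a sketch – the rules are placeholders):

```html
<head>
  <style>
    body { margin: 0; font: 16px/1.4 sans-serif; }
    /* keep this under a few KB so the saved request outweighs the lost caching */
  </style>
</head>
```

It only pays off while the CSS stays tiny, since none of it can be cached for the next page.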

The biggest problem I am finding at the moment is the server waiting time, which makes a mockery of the total time to download an item.

Please supply the URL so I can test with Tools.Pingdom.com