Are 30 JavaScript Files Too Many?

Am I the only one getting frustrated by the ridiculous overuse of JavaScript in web pages these days?

I wanted to check the temperature in my local area, so I browsed to the website of my local television news station. The website loads slowly. It’s like watching paint dry, reminiscent of being on dial-up. So I checked the source and counted over 30 JavaScript files being included.

What happened to usability? What happened to the emphasis on speed and not making users wait over 15 seconds to view a web page?

What kind of developer thinks it’s a good idea to include that many JavaScript files in a web page? I counted 32 JavaScript files, including analytics scripts. I can’t imagine how long all of that would take to load on a slow connection.

You can check it out for yourself:

Also check out this website and see if it loads slowly:

When I view that second website with JavaScript disabled, it loads pretty quickly. With JavaScript enabled, it’s pretty slow. Is it slow for you, too?

It’s getting to the point where browsing the web is no longer enjoyable, at least not for me. Anyone else think there’s too much JS in pages these days? I can’t be the only one.

No, you’re absolutely right. 30 requests for JS is far too many, and that site’s load performance is abysmal. The number of HTTP requests is the single most significant factor in page load performance, and KSTP makes more than 300 requests. That’s a ridonkulously large number.

It is a pretty big task to manage all this. Once you start thinking about modular, component-based applications, the logical conclusion is that components need to load their own chunks of stuff so that you can effectively manage dependencies, and requests start multiplying. 300 requests don’t hurt all that much in development, when you’re probably working locally, or in a test setup that likely sits on a very solid connection to the QA network. Then you go out into the wild and the fun begins.

There isn’t a great way to solve this, though the new generation of web frameworks is taking a serious run at it. I can think of a few that automatically combine and minify the JS files included with each request. The flip side is that now every single page has its own special JavaScript bundle, so you can’t ride the cache either.
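To make the combine-and-minify step concrete, here’s a minimal sketch in a standalone Node build script. This is just my own illustration, not what any of those frameworks actually do internally; the file names are made up, and it assumes the uglify-js package is installed:

```js
// combine.js — concatenate a page's scripts and minify the result into a
// single file, so the browser makes one request instead of 30.
// Assumes `npm install uglify-js` and that the listed files exist.
const fs = require("fs");
const UglifyJS = require("uglify-js");

const files = ["weather.js", "carousel.js", "analytics.js"]; // hypothetical

// Join with semicolons so one file's trailing expression can't swallow
// the next file's opening statement.
const combined = files
  .map((f) => fs.readFileSync(f, "utf8"))
  .join(";\n");

const result = UglifyJS.minify(combined);
if (result.error) throw result.error;

fs.writeFileSync("bundle.min.js", result.code);
```

If the list of files differs per page, you end up with one bundle per page, which is exactly the caching problem described above.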

Yes, you’re right, 30 JS requests are too many. It makes the site slow to load even if the JS itself runs fast. But I suspect sites include this much JS because of the modern world’s internet speeds. If you’re on a connection faster than 10 Mb/s, how slow can it really be?

I think it’s a bit excessive, but I’m a fan of modularization and organization, so better 30 files than putting it all in one.

Then you risk that your visitors never reach you. Not everyone uses broadband, and even if they do, there may be overloaded servers along the way that slow things down even further. More than 30 seconds of waiting means I go and visit another site.

For broadband users it’s even less time. 10 seconds and they’re gone.

Modularization has many advantages, but there needs to be a compromise at some point. I get stressed if I have more than 3 script files :stuck_out_tongue:

Anything’s possible :slight_smile: but I’d be far more worried about ads slowing things down than about JavaScript being separated into a few extra files. Also, the 10-second rule isn’t necessarily true; it really depends on the site. I wait longer than that for one site because it’s always overloaded and drags along, but the content is worth it! But that’s just me :wink:

OK, it was a generic comment, and not everybody is the same. It would depend on the type of connection you have and its speed, and on the content itself. But then, you’d need to know that the content is worth it; if you’ve never seen it, you’d probably get tired and leave.

When you have a business, you don’t want to lose customers unnecessarily.

Modularization is great for the developer but not so great for the end user. At the end of the day, he’s the one that counts.

30 is way too many. We actually have a web property where I work that has a similar number, and I’ve been pushing them to reduce it.

For my own stuff, I keep everything modular and whatnot, but then have a script that automatically minifies and combines it into one file. As wwb_99 said, you run the risk of every page having its own unique script, so there is no caching. To counter this, I generally put all of the general-purpose stuff into one bundle (which creates a file that is still relatively small), and then load any large extra components only on the pages where they are needed (which is usually just a page or two). It takes a lot of extra work, but I think it’s worth it.
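Roughly, my build step looks like this. It’s only a sketch; the file names are invented for illustration, and your layout will differ:

```js
// build.js — one shared, cacheable bundle for the general-purpose stuff,
// while heavyweight page-specific components stay in their own files.
// (Minification omitted here; see the earlier uglify-js sketch.)
const fs = require("fs");
const path = require("path");

// General-purpose scripts used on every page (hypothetical names).
const common = ["core.js", "nav.js", "forms.js"];

const bundle = common
  .map((f) => fs.readFileSync(path.join("src", f), "utf8"))
  .join(";\n");
fs.writeFileSync(path.join("public", "common.js"), bundle);

// A large component used on only a page or two ships separately, so the
// common bundle stays small and stays cached across the whole site.
fs.copyFileSync(path.join("src", "charts.js"),
                path.join("public", "charts.js"));
```

Every page then includes common.js, and only the chart pages also include charts.js, so most visitors make exactly one cached JS request.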

Too many people forget that the number of HTTP requests is usually the largest cause of page slowdown.

Yep, this is something that people working on small-scale sites fail to understand. Granted, there are techniques for using compression on more dynamic sites, but it all depends on the setup. After inspecting the files on the first site, most of them contain very little JavaScript; the creator probably just separated them out to make everything more modular. However, a script should probably exist that aggregates, and perhaps even gzips, the contents. But yeah…
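Pre-gzipping the aggregated file is only a couple of lines in Node, for what it’s worth (a sketch again, with an assumed file path; most servers can also compress on the fly, which may be simpler to set up):

```js
// gzip.js — pre-compress the combined bundle so the server can serve the
// smaller .gz file to browsers that send Accept-Encoding: gzip.
const fs = require("fs");
const zlib = require("zlib");

const js = fs.readFileSync("public/common.js");
fs.writeFileSync("public/common.js.gz", zlib.gzipSync(js));
```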

Also, the fact that the entire contents of table elements have to be downloaded before the table itself can be displayed probably isn’t helping :slight_smile: