Are modern websites using AJAX slower than traditional ones?

I’m noticing more and more websites where all the network requests are purely XHR. In other words, if I open the browser dev tools and watch the Network tab, every single request is XHR.
I guess it’s the new, cool, more modern way to build pages: purely API requests rather than loading static resources?

Anyway, the reason I notice this “trend” is that these very websites tend to have really slow page loads despite really small page sizes.

One example is the login page of a particular service I use. It’s just a login page. The Network tab shows TWO requests, both XHR, and the total page size is all of less than 300 BYTES. It should be the fastest-loading page in the world; I can load a 300-byte page on the cheapest shared webhost in a fraction of a second. But for some reason this page takes 6 seconds to load! How does a 300-byte page take 6 seconds to load when it’s only two XHR requests?
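As an aside, the browser’s Resource Timing API can break each request into phases, which shows whether the time is being spent waiting on the server rather than downloading bytes. A rough console sketch (note that cross-origin requests only expose these details if the server sends a Timing-Allow-Origin header; otherwise the phase values read as 0):

```typescript
// Break each request into phases: DNS, connect, server wait, download.
const entries = performance.getEntriesByType("resource") as PerformanceResourceTiming[];

for (const e of entries) {
  console.log(e.name, {
    dns: Math.round(e.domainLookupEnd - e.domainLookupStart),
    connect: Math.round(e.connectEnd - e.connectStart),
    waiting: Math.round(e.responseStart - e.requestStart), // server "think time"
    download: Math.round(e.responseEnd - e.responseStart), // actual byte transfer
    total: Math.round(e.duration),
  });
}
```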

In another example it’s a hundred times worse. This particular site was all XHR requests, but about 12 of them. Page size was also very low, measured in KBs. But it took 26 seconds to load! And I tested this multiple times, not just one random test.

I know that two examples don’t make a trend, but it’s still alarming that a page of a few hundred bytes, or a few hundred kilobytes, should load so slowly when it’s run entirely over XHR and an API. This is supposed to be the “new” and “modern” and cool way to do stuff, over APIs and JSON and XHR and “living” websites and “reactive” features. But if getting all this awesomeness means I have to deal with websites loading like it’s 2001 on a 2G network, then forget it! Go back to PHP with a little jQuery AJAX thrown in here and there.

Why should such tiny web pages load so slowly when built primarily on XHR requests? I hope this isn’t a trend!

Most likely something is up with your browser. There is NO WAY that login page is coming in at only 300 bytes; those last two sentences alone take up 196 bytes (uncompressed, of course)…

You’re likely looking at that login page with the assets cached; try hard-refreshing (Ctrl+Shift+R) and viewing the results again.
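One quick way to check is the Resource Timing API: cached responses report a transferSize of 0 while the decoded body is still non-zero. A rough console sketch (cross-origin sizes may also read as 0 unless the server opts in with Timing-Allow-Origin):

```typescript
// Flags which responses actually came from the browser cache.
const resources = performance.getEntriesByType("resource") as PerformanceResourceTiming[];

for (const r of resources) {
  const fromCache = r.transferSize === 0 && r.decodedBodySize > 0;
  console.log(fromCache ? "CACHED " : "network", r.name, r.decodedBodySize, "bytes");
}
```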

You also probably have too many browser extensions. If you look in your console’s “page domain selector” (not sure what this thing is called) you’ll see all the additional sources that were loaded into the page. Notice how AdBlock is loaded into every frame on the page, not just the parent frame.

I definitely agree that many sites load slower than I would expect, but it’s not because AJAX somehow got slower. The site’s traffic, how much work the server has to do to generate the page, the location of the server, and other factors do, however, have a huge impact on page loads.
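If you want to separate those factors out, here’s a minimal sketch using the standard Navigation Timing API; it splits the main document’s load into server wait, download, and browser processing:

```typescript
// Paste into the dev-tools console after the page has finished loading.
const [nav] = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];

console.log({
  serverWait: Math.round(nav.responseStart - nav.requestStart), // latency + backend work
  download: Math.round(nav.responseEnd - nav.responseStart),    // actual byte transfer
  domProcessing: Math.round(nav.domContentLoadedEventEnd - nav.responseEnd),
  fullLoad: Math.round(nav.loadEventEnd - nav.startTime),
});
```

If serverWait dwarfs everything else, the slowness is the backend or the network, not XHR itself.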


Yeah, there has to be something going on.

I’m just hoping that the move toward 100% API-based websites isn’t going to go through a slow and buggy phase. I tried a different browser and still got a 4.5-second load for a 500 KB page. That was a mix of static and XHR resources.

It definitely sounds like something is wrong there, then, and I doubt it’s down to XHR requests. You should be getting a very speedy response from a page that size. What page are we talking about?


One page I had trouble with today (which is not all XHR, btw) is the BigCommerce login page.
https://login.bigcommerce.com/login

It feels like it should be faster, and they’re a major provider, so you’d think they’d have really good hosting and such. Just a random example.

Another slow site is ShipStation.

I could be completely wrong and baseless here, but it’s like all these modern, reactive, “living” web services I use are just plain slow! My old PHP web forum works faster, but nothing is reactive! lol

This forum runs on Discourse and it’s not slow, yet it’s very reactive, so I don’t know.

I just tested it here. There were 115 requests the first time round, and 78 on the refresh. It does seem like there are a fair few things going on there, but it does start to render content far sooner than the overall load time suggests.
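You can see that in numbers with the standard Paint Timing API; a small sketch that logs when the browser first painted content, which usually lands well before the full load event:

```typescript
// Logs first-paint and first-contentful-paint; "buffered: true" picks up
// paint entries that were recorded before this observer was attached.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    console.log(entry.name, `${Math.round(entry.startTime)} ms`);
  }
}).observe({ type: "paint", buffered: true });
```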

This might be an interesting article to read, as it focuses on the order in which resources are loaded into browsers. You might find something relevant to what you’re seeing.


The waterfalls in tools like GTmetrix can be useful for studying the load order and timings, and for spotting hold-ups and bottlenecks.
https://gtmetrix.com/reports/login.bigcommerce.com/ejqvByOp

Well, it’s not important for me to try and troubleshoot the websites of these services. I guess I’m just surprised that, with all the modern development knowledge we (collectively) have now, a single page from a major company can take so long to load. 26 seconds is kind of shocking!

I pretty much browse the web with uBlock Origin on all the time, not even because I’m afraid of trackers or anything, but just because it makes things tolerably faster!

One page I had trouble with today (which is not all XHR, btw) is the BigCommerce login page.
https://login.bigcommerce.com/login

This page is actually incredibly well optimized to the point where they have obviously considered Time to First Meaningful Paint.

I’m not sure what page you’re looking at, but the ShipStation main site is also optimized. It even defers font loading.
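For reference, deferring fonts can be as simple as this sketch using the standard FontFace API (the family name and URL here are made up for illustration): text renders immediately in a fallback font, and the web font swaps in once it has downloaded.

```typescript
// Hypothetical font family and URL, purely for illustration.
const brandFont = new FontFace("BrandFont", "url(/fonts/brand.woff2)");

brandFont.load().then((loaded) => {
  document.fonts.add(loaded);                               // make it available to CSS
  document.body.style.fontFamily = "BrandFont, sans-serif"; // swap in the web font
}).catch(() => {
  // If the font fails to load, the fallback font simply stays in place.
});
```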

There is something wrong with your machine or network. Both of these sites have been optimized by people who know what they are doing.


The user shouldn’t care how data is given to them. Templating is generally done on the frontend now. There are some sites that abuse this, but for the most part, if it’s done correctly, it should be faster and snappier for the user. Prerendering can give a small performance boost, but the added complexity of building a page that way is rarely worth it.
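To illustrate the pattern (the endpoint, element id, and field names here are all hypothetical): the server returns only JSON, and the browser builds the HTML, so on each interaction only the data crosses the wire.

```typescript
// A bare-bones frontend-templating sketch: fetch JSON, render markup.
interface Product {
  name: string;
  price: number;
}

async function renderProducts(): Promise<void> {
  const res = await fetch("/api/products");        // hypothetical JSON endpoint
  const products: Product[] = await res.json();

  const list = document.querySelector("#product-list")!; // hypothetical container
  list.innerHTML = products
    .map((p) => `<li>${p.name}: $${p.price.toFixed(2)}</li>`)
    .join("");
}

renderProducts();
```

Done badly, though (a dozen of these fired one after another before anything renders), this is exactly how a tiny page ends up taking 26 seconds.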


Amazing. It’s so optimized it almost beat a 10-second page load this morning!
