Is now the time to drop JavaScript and just use AJAX?

The past couple of years have seen drastic web performance improvements, mostly due to:

  1. virtually all website hosting now uses SSDs

  2. website hosts now offer more memory

  3. internet connections are now much faster

Reasons for dropping JavaScript…

…mostly because I have quite a complicated PHP update routine that:

  1. downloads three XLS Spreadsheets from another host

  2. converts from XLS to CSV format

  3. deletes six old MySQLi database tables

  4. updates the relevant tables from the CSV files

  5. removes duplicates and empty columns

  6. creates indexes to improve search performance

The above operations take 16.19 seconds on a five-dollar VPS with 1 GB of memory, and 66.94 seconds on my local computer with 8 GB of memory and a 100 Mbps internet connection. Both machines run Ubuntu 18.04.

Surely with such a vast elapsed time difference…

…enormous JavaScript libraries could be eliminated and AJAX used instead to update web pages?


Another reason is that I use AJAX to search tables with about 20,000 records and the response is nearly instant.

Edit again:

Corrected drive acronym from SDD to SSD.


Not sure I completely understand the question.

Well, Ajax is really just JavaScript. At least the asynchronous JavaScript part on the client is. Applications might use XML to send data back, but it is probably more common to use plain text or JSON. So, dropping JavaScript but using Ajax seems contradictory to me.

If on the other hand you are on about doing the heavy lifting on the server, rendering some HTML and then returning that to drop into the page (as opposed to doing much DOM manipulation on the client), then yeah, that makes sense.
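That "heavy lifting on the server" approach can be sketched in a few lines. Here `/render-table.php` is a hypothetical endpoint that returns a ready-made HTML fragment:

```javascript
// Fetch server-rendered HTML and drop it into a container element,
// instead of building the DOM on the client.
async function loadFragment(url, container) {
  const res = await fetch(url);
  if (!res.ok) throw new Error('HTTP ' + res.status);
  container.innerHTML = await res.text();
}

// In the browser:
// loadFragment('/render-table.php', document.getElementById('results'));
```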

Which libraries?


Good question. The equivalent of ajax can now be performed using fetch, which is native to JavaScript. No libraries at all.
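For instance, a jQuery-style `$.getJSON` call can be replaced with native `fetch` like this — a minimal sketch, assuming a hypothetical `/search.php` endpoint that returns JSON:

```javascript
// Build the request URL; a pure helper, easy to test on its own.
function buildSearchUrl(base, query) {
  return base + '?q=' + encodeURIComponent(query);
}

// Native replacement for a jQuery-style $.getJSON call — no library needed.
async function searchRecords(query) {
  const res = await fetch(buildSearchUrl('/search.php', query));
  if (!res.ok) throw new Error('HTTP ' + res.status);
  return res.json();
}

// Usage: searchRecords('smith').then(rows => console.log(rows));
```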


Right. So the question should be “Does it make sense to use jQuery to perform Ajax requests?”, to which the answer is “probably not” for the reason you just stated.

But I’m guessing. I’m sure John will shed some light before long :slight_smile:


A bit off-topic perhaps, but does that mean fetch is not Ajax?


XMLHttpRequest and fetch are both Ajax, but typically it’s the former that most people know about, as it’s been around for so much longer.


The point of Ajax is to update parts of a web page, without reloading the whole page. You can do that with XMLHttpRequest, or you can do it with fetch(). So, to answer the question, you can use fetch() to perform an Ajax request in the same way that you can use XMLHttpRequest.
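The two APIs side by side — the same Ajax request written first with XMLHttpRequest and then with fetch (the URL is hypothetical):

```javascript
// Old style: XMLHttpRequest, callback-based.
function getJsonXhr(url, onSuccess) {
  const xhr = new XMLHttpRequest();
  xhr.open('GET', url);
  xhr.onload = () => {
    if (xhr.status === 200) onSuccess(JSON.parse(xhr.responseText));
  };
  xhr.send();
}

// Modern style: fetch returns a promise instead of taking callbacks.
function getJsonFetch(url) {
  return fetch(url).then(res => {
    if (!res.ok) throw new Error('HTTP ' + res.status);
    return res.json();
  });
}
```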


Gotcha. It was the phrase “The equivalent of ajax” that made me wonder.


My knowledge of JavaScript is limited. As an example of large libraries, I was thinking of this forum: it loads slowly even though it is virtually all text, and adverts should load after the page has rendered. Why does it take about thirty seconds for the page to load?

I’ve just reloaded this page with the cache disabled, and the Network panel says that it takes 15 seconds. It takes just 5 seconds with the cache. I don’t have the fastest internet speed either, as I’m at the end of a far-away wifi connection, giving me the stunning speeds of 8 Mbps download and 0.82 Mbps upload. Nearly into the single digits! Yeah!


So for me, the problem is likely to be connected with my slow upload speed. There are other things that can be checked out locally to find out where the bottleneck is, but based on my use of this website from other faster locations, I doubt that there’s much that can be done locally that would help with the thirty seconds loading time.


I just tried another site that takes about thirty seconds to refresh even though it was previously loaded and displayed in the browser.

An API key is required for the above site; once loaded, changing time frames immediately displays new graphs, and many other selections are instant.

I get the impression that far too much is being loaded on the off-chance that users may request the option. I try to minimise the page size and use AJAX to pull data and insert it into the rendered page.

In other words, you’re employing JIT loading, which is fine unless the AJAX request stalls for any appreciable length of time.

If the page takes 10 seconds to render (random number pulled out of the aether), does it make a difference whether that 10 seconds falls within the first 10 seconds of page load, or the second 10 seconds? Probably not.

If you have data that’s coming from multiple sources, one of which is unreliable, then it can certainly be helpful to asynchronise the pulls.
If you’ve got more data than you display on the page at any one time, then AJAX loading (effectively, paginating the entire page) is great.
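That "paginating the entire page" idea often comes down to translating a page number into the offset/limit query parameters a typical SQL-backed endpoint expects — a sketch with hypothetical names:

```javascript
// Convert a 1-based page number into the offset/limit pair a typical
// SQL-backed endpoint uses (e.g. LIMIT ? OFFSET ?).
function pageParams(page, pageSize) {
  return { offset: (page - 1) * pageSize, limit: pageSize };
}

// The client then only ever requests one page of records, e.g.:
// fetch('/records.php?offset=100&limit=50')
```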

For a ‘static’ page, you’ve still got to move X data, which requires Y time to process. AJAX doesn’t fix that. JavaScript doesn’t fix that. Improving the page certainly does.


This is the trade off between an SPA (single-page application) and your traditional server-rendered app. With many SPAs, you are effectively downloading the whole app on your first visit. This is obviously coupled with a performance hit, but is balanced out by the fact that everything is then snappier and more responsive as you navigate around during the rest of your visit.

There are various tricks you can employ to make the initial hit as painless as possible, such as code splitting, conditional/lazy loading etc. The important metric in Lighthouse is “Time to interactive”, which describes how long it takes until a user can interact with your app in some way shape or form.
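Lazy loading in practice often comes down to dynamic `import()`: the extra code is only downloaded when the user actually needs it. A sketch, where `./chart.js` and `drawChart` are hypothetical:

```javascript
// The charting code is only fetched when the user opens the chart view,
// keeping it out of the initial bundle.
async function showChart(container) {
  const { drawChart } = await import('./chart.js'); // hypothetical module
  drawChart(container);
}
```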

One of the bigger downsides to the SPA approach is SEO. As JavaScript is inserting your content into the page (content probably pulled from an API or somewhere), it is not as accessible to the Googlebot. To counteract this you can do server-side rendering, which sees the HTML of the app generated on the server and “re-hydrated” in the browser by your framework of choice.

I read this article recently which looks at the trade-offs of this approach.

My favorite line from the article is:

All of the fancy optimizations are trying to get you closer to the performance you would’ve gotten if you just hadn’t used so much technology.

IMO, it’s all about knowing what you are building and who you are building it for and basing your stack on that.


Looking forward to experimenting with fetch(…), just need to clear my bench…

Interesting and well worth trying on existing pages. When my bench is cleared I will give it a try.

A very good article.

My favourite quote is:

Ironically, backends are churning through technology even faster than frontends, which have been loyal to one programming language for decades.

I have found it is now impossible to keep abreast of technology. Options keep increasing, whereas in the ‘good old days’ the choice was limited to Visual Basic 5.0 vs. Delphi 3.0.

It seems my computer knowledge is rapidly decreasing on a daily basis :frowning: