Best Practices for Handling Large JSON Data in JavaScript

Hello everyone,

I’m currently working on a project where I need to manage and manipulate large JSON datasets using JavaScript. Given the size of the data, I’m concerned about performance issues and browser responsiveness.

Does anyone have recommendations on efficient ways to handle large JSON data? Specifically, I’m looking for:

  1. Techniques for parsing and processing JSON without slowing down the user interface.
  2. Any specific libraries or tools that could help manage heavy data more efficiently.
  3. Examples of best practices or patterns you’ve successfully implemented in similar scenarios.

Thank you!

This looks like an interesting thread on Stack Overflow.

Might be worth investigating some of the recommended libraries like D3 and Pyodide.


First response: define "large". Human-scale large and computer-scale large are orders of magnitude apart.


Maybe there is a reason why big data is normally handled by databases and not text files (JSON is nothing more than a text file).

In desktop applications, work that would make the UI unresponsive is done in the background, in another thread for example. Things are more complicated for web applications. Will this be done in the client or on the server? Is it necessary to process all the data every time a page is requested, for every user? An ideal solution would be to periodically process the data in the background on the server. And then …
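If the processing does have to happen in the client, one common pattern is to parse once and then process the resulting array in slices, yielding to the event loop between slices so the browser can repaint. This is a minimal sketch of that idea; the helper name `processInChunks` is mine, not from any library. Note that `JSON.parse` itself is still synchronous, so for payloads large enough that parsing alone janks the UI, moving the parse into a Web Worker is the usual next step.

```javascript
// Process a large array in chunks, yielding between chunks so the
// main thread stays responsive. Returns a Promise of the results.
// (Hypothetical helper, shown only to illustrate the pattern.)
function processInChunks(items, fn, chunkSize = 1000) {
  return new Promise((resolve) => {
    const results = [];
    let index = 0;
    function next() {
      const end = Math.min(index + chunkSize, items.length);
      for (; index < end; index++) {
        results.push(fn(items[index]));
      }
      if (index < items.length) {
        // Yield to the event loop so pending UI work can run.
        setTimeout(next, 0);
      } else {
        resolve(results);
      }
    }
    next();
  });
}

// Usage: const data = JSON.parse(hugeJsonString);
// processInChunks(data, transformRecord).then(renderResults);
```

In a browser you might prefer `requestIdleCallback` over `setTimeout` for the yield, but `setTimeout(fn, 0)` works everywhere.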

Again, it depends on whether the data is being used in the client or on the server. If it must be accessed in the client, then converting the data to delimited files (CSV, tab-delimited, or something like that) might be more efficient. More likely, though, you need to process the data on the server, in which case a database is probably the best choice.
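To illustrate the delimited-file suggestion, here is a minimal, hypothetical converter, assuming flat records that all share the same keys. CSV only saves space when the keys repeat across many records, and real CSV handling has more edge cases than this, so a tested library is safer in production:

```javascript
// Convert an array of flat, uniformly-keyed objects to a CSV string.
// (Illustrative sketch only; assumes every record has the same keys.)
function toCsv(records) {
  if (records.length === 0) return "";
  const headers = Object.keys(records[0]);
  // Quote fields containing commas, quotes, or newlines; double
  // embedded quotes, per common CSV convention.
  const escape = (value) => {
    const s = String(value ?? "");
    return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
  };
  const lines = [headers.join(",")];
  for (const record of records) {
    lines.push(headers.map((h) => escape(record[h])).join(","));
  }
  return lines.join("\n");
}

// Usage: toCsv([{ id: 1, name: "Ada" }, { id: 2, name: "Lin" }])
```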
