I’m currently working on a project where I need to manage and manipulate large JSON datasets using JavaScript. Given the size of the data, I’m concerned about performance issues and browser responsiveness.
Does anyone have recommendations on efficient ways to handle large JSON data? Specifically, I’m looking for:
- Techniques for parsing and processing JSON without slowing down the user interface.
- Any specific libraries or tools that could help manage heavy data more efficiently.
- Examples of best practices or patterns you’ve successfully implemented in similar scenarios.
In desktop applications, work that would otherwise make the UI unresponsive is done in the background, for example in another thread. Things are more complicated for web applications. Will the processing happen on the client or the server? Is it necessary to process all the data every time a page is served, for every user? An ideal solution would be to periodically process the data in the background on the server. And then …
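If the work must happen on the client, the browser equivalent of "another thread" is a Web Worker. Below is a minimal sketch of that idea: `JSON.parse` still blocks, but it blocks the worker instead of the UI thread. The inline-Blob setup and the `parseInWorker` helper name are illustrative, not from the question; in a real project the worker usually lives in its own file.

```javascript
// Worker source inlined via a Blob URL, purely for a self-contained example.
const workerSource = `
  self.onmessage = (event) => {
    // JSON.parse can take a long time on large payloads, but here it
    // runs in the worker, so the page stays responsive.
    const data = JSON.parse(event.data);
    self.postMessage({ count: Array.isArray(data) ? data.length : 1 });
  };
`;

function parseInWorker(jsonText) {
  return new Promise((resolve, reject) => {
    const url = URL.createObjectURL(
      new Blob([workerSource], { type: "application/javascript" })
    );
    const worker = new Worker(url);
    worker.onmessage = (event) => {
      resolve(event.data);
      worker.terminate();
      URL.revokeObjectURL(url);
    };
    worker.onerror = (err) => {
      reject(err);
      worker.terminate();
      URL.revokeObjectURL(url);
    };
    worker.postMessage(jsonText);
  });
}

// Usage (browser only):
if (typeof Worker !== "undefined" && typeof Blob !== "undefined") {
  parseInWorker('[{"id":1},{"id":2}]').then((result) =>
    console.log(result.count)
  );
}
```

Note that `postMessage` copies the string to the worker via structured clone, which is usually cheap compared to the parse itself.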
Again, it depends on whether the data is used on the client or the server. If it must be processed on the client, then converting the data to a delimited format (CSV, tab-delimited, or similar) might be more efficient. More likely you need to process the data on the server, in which case a database is probably the best choice.
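To illustrate the delimited-format suggestion: for an array of uniform objects, CSV avoids repeating every key name on every row, which can noticeably shrink a large payload. A minimal sketch (the `toCSV` helper is my own illustration; it assumes every row has the same keys, and a library such as PapaParse would be more robust in production):

```javascript
// Flatten an array of uniform JSON objects into CSV text.
// Field names are taken from the first row.
function toCSV(rows, delimiter = ",") {
  if (rows.length === 0) return "";
  const headers = Object.keys(rows[0]);
  const escape = (value) => {
    const text = String(value ?? "");
    // Quote fields that contain the delimiter, quotes, or newlines.
    return /["\n]/.test(text) || text.includes(delimiter)
      ? '"' + text.replace(/"/g, '""') + '"'
      : text;
  };
  const lines = [headers.map(escape).join(delimiter)];
  for (const row of rows) {
    lines.push(headers.map((key) => escape(row[key])).join(delimiter));
  }
  return lines.join("\n");
}

const csv = toCSV([
  { id: 1, name: "Ada" },
  { id: 2, name: "Grace, Rear Admiral" },
]);
console.log(csv);
// id,name
// 1,Ada
// 2,"Grace, Rear Admiral"
```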