I have the idea to put around 30 charts on a webpage. The charts will be loaded with data from a SQLite database, and a Node.js Express webserver sends the data and also receives new data every second from another program.
I think the Node.js server will be fine reading and writing this amount of data every second, but I am not sure what the best solution is for updating the charts on the webpage.
I have heard about things like Web Workers, Service Workers and Server-Sent Events. Myself, I would use a setTimeout function in the webpage that reads the data again every second and writes it into the charts, but in my testing with just around 10 charts I already see that the browser takes too much CPU and RAM.
I am using ApexCharts on the page, and I am not sure if my setTimeout function reading from the database every second is the problem, or how the professionals do it. What is the best technique to update such charts every second?
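Roughly what my current attempt looks like (simplified; the endpoint name `/api/data` is just for illustration):

```javascript
// Pure helper: turn rows from the server into an ApexCharts series array.
function rowsToSeries(rows) {
  return [{ data: rows.map(r => ({ x: r.timestamp, y: r.value })) }];
}

// The problematic pattern: a new request fires every second whether or
// not the previous one has finished, so requests can pile up.
function startPolling(chart) {
  setInterval(async () => {
    const res = await fetch('/api/data');       // ask the server for fresh rows
    const rows = await res.json();
    chart.updateSeries(rowsToSeries(rows));     // ApexCharts update call
  }, 1000);
}
```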
Well, sending one-second data through the web with just a timer isn't really going to work, as you're seeing. You need a mechanism where, if the data doesn't arrive in time or isn't received in a given second, your program just continues on to the next second's data. What you could try is something like a UDP connection, which is what's used for video: if one frame of the video is not received, the client side forgets about it and just gets the next frame.
You could try playing with something like the following gist…
I haven't tried it myself, but it appears to set up a UDP server with Node.js, send data, and include code for the client to receive and send back. For something this time-critical you need a 'firehose' type connection where, again, if the client doesn't receive the data in one second it can just move on.
Honestly, the disconnected nature of the web makes something this time-critical hard to do reliably. I think a UDP connection might be your best bet. That, or you could simply not put so many charts on a single page and break them up into various page sections. Do they really need all 30 at once, all updating every second?
You do not need a socket connection. You just need to wait for the data request to finish before starting a new one. Then you are fine with the timing.
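Something like this (the endpoint name and the `updateCharts` callback are just examples):

```javascript
// Pure helper: how long to sleep so polls target a fixed cadence
// without ever overlapping.
function nextDelay(intervalMs, elapsedMs) {
  return Math.max(0, intervalMs - elapsedMs);
}

// Sequential polling: the next request is only scheduled after the
// current one has fully finished, so requests can never pile up. If a
// round trip takes longer than a second, that update is late, not queued.
async function pollLoop(updateCharts, intervalMs = 1000) {
  while (true) {
    const started = Date.now();
    try {
      const res = await fetch('/api/data');   // assumed endpoint
      updateCharts(await res.json());
    } catch (e) {
      // A failed round is simply skipped; the loop carries on.
    }
    await new Promise(r => setTimeout(r, nextDelay(intervalMs, Date.now() - started)));
  }
}
```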
But in the end, think about what will happen to your server when 1000 or even more users open your page…
It is never a good idea to show a user more data than he can work with. Or are you able to watch 30 diagrams at once and understand what's in them? I guess not. So just offer the user what his mind can take in.
Honestly, I don't think this is going to work either. The real problem the OP is going to have is requesting, waiting for, and parsing a response within a second. Sure, you can wait for one to finish before starting another, but seconds will still be missed. Relying on an external source to deliver all the info within a second, in time to update before the next request, is neither reliable nor feasible. The best bet will be a more direct open connection where data can be exchanged faster. This isn't even about the full 30; just a handful of charts would hit the same issue.
If the OP is willing to let some seconds go from time to time, sure, the above advice is fine. Maybe every 5 seconds would be more reasonable.
Edit: Maybe you can do it in background threads with parallelism? That is another avenue to explore.
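For example, a Web Worker could do the fetching and parsing off the main thread and hand the page only plot-ready series (the file name, endpoint, and `charts` lookup here are all made up; the browser-only parts are shown as comments):

```javascript
// Pure helper, usable in the worker: reshape raw rows into per-chart
// series so the heavy work happens off the UI thread.
function groupByChart(rows) {
  const charts = {};
  for (const r of rows) {
    (charts[r.chartId] ||= []).push({ x: r.timestamp, y: r.value });
  }
  return charts;
}

// worker.js (assumed filename): poll once per second off the main thread.
// setInterval(async () => {
//   const res = await fetch('/api/data');          // assumed endpoint
//   postMessage(groupByChart(await res.json()));   // send plot-ready data
// }, 1000);

// Main page: just hand the prepared series to ApexCharts.
// const worker = new Worker('worker.js');
// worker.onmessage = (e) => {
//   for (const [id, data] of Object.entries(e.data)) {
//     charts[id].updateSeries([{ data }]);
//   }
// };
```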
Exactly. Using a more direct connection is like switching from a bicycle to a car in an attempt to reach light speed: it will not help at all. That's why I said he does not need it. If there is too much data, it cannot be processed in 1 second no matter which way it arrives. So he needs to wait until the first data load is processed, and then he can request the second load. However long it takes, it takes, whether 1 second or 1 hour. It is what it is.