It’s been a long time since I last posted here, but I couldn’t think of a better venue for this.
Compared to file systems, databases are generally considered far more effective at managing data, but I am trying to find examples where this is not the case.
I came across a post that addressed this question by conducting a series of very interesting tests.
Although, as the writer states, there wasn’t a lot of science behind the environmental conditions of these tests, I still think they are valuable and certainly worth considering.
Challenge: Let’s say you have a page cached through a content delivery network. All of the static content of the page is being cached: images, JS scripts, CSS assets, and so on.
On the same page there is an area populated by a JSON call that requires a request to an external content provider. The service pulls data from the provider, stores it in a local database, and serves it to the frontend when the JSON call comes in.
The obvious advantage of the content delivery network is that it serves static, localized content distributed across a global network, providing faster local access. The service call, on the other hand, introduces significant latency, and for various reasons this piece of data cannot be cached by the content delivery network.
Idea: Since, apparently, fetching a single piece of data is close to 10 times faster through the file system than through the database, would it not be feasible to store the set of values in independent files on the file system (e.g. spread across multiple SSD drives)? Each file would carry the ID of its parent value in its name, and could be mapped/called through the query string, session, or cookie. This would mean that instead of making a call to the service, the page would only include a value from a static file, which could even be cached.
There are a couple of ways we could generate these files:
1. Per user request. The first time the page is requested by a user, the file would be generated and stored on the SSD drives. A second user requesting the same page would not experience the same level of latency, as the file would have already been generated. To take this a step further, the content delivery network could also cache this value.
2. Have a job pre-generate the files for all dynamic pages and store them on the SSD drives.
Question: Is it correct to assume that this would be faster than using a LOCAL Memcached instance?
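Rather than assuming, this is easy to measure on the actual hardware. A minimal harness for the file-system side (the Memcached side could be wrapped the same way, e.g. with a `pymemcache` client's `get()`); the numbers will depend heavily on whether the file is already in the OS page cache:

```python
import os
import tempfile
import timeit

# Write a small payload comparable to one cached JSON fragment.
path = os.path.join(tempfile.mkdtemp(), "fragment.json")
with open(path, "w", encoding="utf-8") as f:
    f.write('{"value": 123}' * 100)  # ~1.4 KB payload

def read_file():
    with open(path, encoding="utf-8") as f:
        return f.read()

n = 1000
secs = timeit.timeit(read_file, number=n)
print(f"{secs / n * 1e6:.1f} microseconds per read")
```

Note that repeated reads of the same file mostly hit the OS page cache, i.e. memory, which is exactly why file reads can look competitive with a local Memcached in micro-benchmarks; a cold read from the SSD is a different story.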
Your input is more than welcome!