I would like to know if anyone has dug into creating a cache system to improve the overall performance of a website. Something similar to this tutorial.
If so, are some methods better than others for building such a system?
Why… would you cache dynamic content? (If it’s not dynamic, why are you using PHP?)
The way I saw it was to cache dynamic content with a time frame. For example, my site would have news being released. However, news isn’t posted every 2-3 seconds, so setting a time limit of ~5 minutes would let the site avoid regenerating the data every time a user navigates around (example: home -> about -> contact -> home).
Performance reasons. On a high-traffic site that requires a lot of complex queries to construct the home page, the load might be enough to bring a hefty dedicated server to its knees. If the page content is cached for even a short time, the number of CPU-expensive database calculations drops drastically, by a factor of hundreds or even thousands. Pretty much all high-traffic sites use caching in some form, because the alternative is massively increased expenditure on server infrastructure. Very few sites need truly ‘up to the second’ currency in content.
There are quite a few methods of caching, and which you use depends on a number of criteria: available resources versus generated demand in CPU, disk I/O, and memory, the ‘cost’ of rebuilding the cache, and page structure, which can also dictate which to choose.
You can cache data to memory (fast but volatile) or disk (resilient but slow). At the most basic, you can just write a file to disk, then read it back until it passes an age threshold, then generate fresh data. You can also let MySQL handle query caching, cache compiled PHP opcodes (XCache, APC), or put a caching proxy in front of the entire web server stack (Squid, Varnish). If you don’t have the ability to install new software on your host, though, most of these options are eliminated.
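The basic write-a-file-then-check-its-age approach can be sketched like this (the cache path and the 300-second TTL are just illustrative assumptions, not anything from the tutorial):

```php
<?php
// Minimal time-based file cache sketch. The expensive work
// (database queries etc.) lives in the $generate callback.
function cached_page(string $cacheFile, int $ttl, callable $generate): string
{
    // Serve from cache while the file exists and is younger than the TTL.
    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        return file_get_contents($cacheFile);
    }
    // Cache miss or stale: rebuild, then write atomically via a temp
    // file so a concurrent reader never sees a half-written page.
    $html = $generate();
    $tmp = $cacheFile . '.tmp';
    file_put_contents($tmp, $html, LOCK_EX);
    rename($tmp, $cacheFile);
    return $html;
}

// Usage: regenerate the home page at most once every 5 minutes.
echo cached_page('/tmp/home.cache.html', 300, function () {
    // ...run the expensive database queries here...
    return "<html><body>News as of " . date('H:i') . "</body></html>";
});
```

Anyone browsing home -> about -> contact -> home within the TTL window gets the stored file straight back instead of re-running the queries.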
Most PHP caching systems at the very least retain the bytecode of PHP files after they are compiled the first time, so they do not have to be repeatedly loaded from disk and parsed. Caching on that level alone can speed a site up by a factor of 10.
So, the obvious retort: OP, how many visitors does your site get a day, and how long does a page take to load? If you’re not talking thousands of visitors and more than a second of processing time, I doubt there’s a NEED for caching.
The site is not done yet; we are mostly still in the planning phase. A bunch of HTML pages have been done, which I will have to convert.
It is reasonable to plan for a dozen visitors daily due to multiple factors around the project, such as alliances etc.
A site that has a dozen visitors a day does not need caching.
Unless you mean dozens of thousands?
Whoops, my bad. I meant dozens of thousands…
Time to go hide myself in a barrel ><
I let the server itself handle caching…where possible.
In other words, for semi-dynamic pages PHP does not need to be invoked.
That alone saves a trip.
^The same thing can be achieved with nginx and memcached. See http://blog.martinfjordvald.com/2011/02/implementing-full-page-caching-with-nginx-and-php/
It’s one of the best caching setups I’ve seen described so far.
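The gist of that setup, as I understand it, is that nginx looks the whole rendered page up in memcached itself and only hands the request to PHP on a miss, with PHP writing the page back under the same key. A rough config sketch (the key scheme, memcached port, and PHP-FPM socket path are my assumptions, not necessarily what the linked post uses):

```nginx
# Try the fully rendered page from memcached first.
location / {
    set $memcached_key "page:$request_uri";   # key scheme is an assumption
    memcached_pass 127.0.0.1:11211;
    default_type text/html;
    error_page 404 502 504 = @php;
}

# Cache miss: PHP builds the page and stores it in memcached
# under the same key, so the next request never touches PHP.
location @php {
    fastcgi_pass unix:/var/run/php-fpm.sock;  # socket path is an assumption
    include fastcgi_params;
    fastcgi_param SCRIPT_FILENAME $document_root/index.php;
}
```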