I set up my test server last week at Bluehost and after reading up on gzip compression via .htaccess files I learned that while Bluehost allows custom .htaccess files, they have not enabled gzip compression.
Since they allow custom php.ini files I decided to enable zlib compression by changing the following settings in the .ini file:
zlib.output_compression = On
zlib.output_compression_level = 6
zlib.output_handler = (left empty; phpinfo() shows "no value")
(both local and master values are equal).
This does not enable compression on .css or .js files, however, unless you rename them .css.php or .js.php.
Is this the best way to compress CSS and JS files without gzip? Do I simply change the <link> tags to point to the .css.php or .js.php? Don’t I have to indicate somewhere that the .css.php should be read as CSS content?
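If it helps, here is a minimal sketch of what such a wrapper might look like (the filenames are just examples, not anything Bluehost prescribes). The key point is that the .php file has to send the CSS Content-Type header itself, because otherwise the response would be labeled text/html:

```php
<?php
// style.css.php — hypothetical wrapper around an ordinary CSS file.
// With zlib.output_compression = On in php.ini, PHP gzips this
// output automatically for browsers that send Accept-Encoding: gzip.
header('Content-Type: text/css');
readfile('style.css');  // stream the stylesheet through unchanged
```

The &lt;link&gt; tag would then point at style.css.php instead of style.css, and the same pattern with Content-Type: application/javascript would cover the .js.php case.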
The first thing I’d ask is whether the gzip module is available. If not, you’re SOL (something out of luck), OR you need to compress your files yourself and upload them (without the .php extensions!). Personally, I believe that’s your best option, and it would likely be my preference regardless of whether the server can compress, because of the overhead of compressing a file repeatedly on every request.
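To flesh that out: you’d compress the files locally (e.g. `gzip -9 -c style.css > style.css.gz`) and upload both copies. Assuming mod_rewrite and mod_headers are enabled on the host (an assumption worth checking, since mod_deflate evidently isn’t), an .htaccess along these lines could hand the pre-compressed copy to browsers that accept gzip; treat it as a sketch, not a tested Bluehost recipe:

```apache
# Serve a pre-made style.css.gz when the browser accepts gzip
RewriteEngine On
RewriteCond %{HTTP:Accept-Encoding} gzip
RewriteCond %{REQUEST_FILENAME}.gz -f
RewriteRule ^(.+)\.css$ $1.css.gz [L]

# Label the .gz copies so browsers decode them as gzipped CSS
<FilesMatch "\.css\.gz$">
  ForceType text/css
  Header set Content-Encoding gzip
</FilesMatch>
```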
Bluehost does not have the gzip module available. They say their alternative is to enable zlib compression via a customized php.ini file the user can edit.
I’ll concede to your expertise, but I find it strange that manually compressing your files would be more efficient than using the server’s zlib auto-compression. Yeah, it’s more overhead for the server, but if we’re talking about a small site/blog, is it really worth the extra effort of doing it manually?
Granted, I have never manually compressed and uploaded files to a web server for the purpose of being used as a website. Is there a tutorial you can recommend that can show me the steps? Maybe it’s not as bad as I am assuming…
I’m not sure what they’re doing with their php.ini file other than processing all PHP scripts through a compression engine, which will then require … what? Decompression via PHP by the visitor? PHP is a server-side output engine, so it would have to mimic the server’s compression, which should not require the .php extension on CSS/JS files. Okay, I don’t understand what they’re trying to do.
I have a client who insisted (with a former website layout) on adding content to the Home Page until it was at least a half dozen ‘pages’ long (in portrait format!). I “beat” on him repeatedly about that, but, as his content was donated to him for posting, he HAD to include it all on the Home Page. To get around building this huge page each time it was loaded, I re-cached the content of the Home Page every time an article was added or updated. I could just as easily have compressed it at that time but …
IMHO, doing something repeatedly and expecting different results is a sign of insanity, ergo, my approach above. HOWEVER, if your blog is expecting updates (page or view count, if nothing else), then “live compression” may be the better way to go (I’d still cache/compress around the live parts for speed).
As for manual compression, as above, that’s not necessary. My cache file was updated every time a new article was added or an article changed which appeared on the Home Page. As you’d only mentioned that your site is small, it seemed a reasonable option to compress manually (rather than “on the fly” or “on change”). It’s all in the wrist … er, design of your website!
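A rough sketch of that cache-on-update idea, with all names invented for illustration (this isn’t the actual client code):

```php
<?php
// Hypothetical "cache on change": the admin backend calls this once
// after an article is saved, so ordinary page views never rebuild
// the Home Page from the database.
function rebuild_home_cache(array $articles): void {
    $html = '';
    foreach ($articles as $article) {
        $html .= '<article><h2>'
               . htmlspecialchars($article['title'])
               . '</h2>' . $article['body'] . '</article>';
    }
    file_put_contents('cache/home.html', $html);
}

// Each page view then just streams the pre-built file:
// readfile('cache/home.html');
```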
Sorry, no reference available as I “look-up live.”
Okay, that was really to explain my rationale for prior comments which, if taken out of context of the limited information provided, may easily be adjudged inapplicable to your situation.
It’s just that it seems most web hosting companies do not make the gzip module available (Bluehost/FD/HostMonster and HostGator are two examples), and my first impression is that this makes developing a large site counter-productive, since stricter limits are placed on your design than if gzip were available. My limited experience tells me compression is a must the larger your site gets.
Am I making a big deal out of something that is really not that critical because I let YSlow’s report on my site’s compression light a fire under my butt?
In the end, I want to adopt proper practices that make my websites as quick as possible through sound design principles, while also utilizing any available server resources that can complement this.
Actually, I’m sure you could use PHP to compress, but I’m just a little less sure whether you’d need a gzip module (PHP or Apache) to do so.
I think it’s more of a statement about those companies that they will not provide a common Apache tool.
Unfortunately, I do believe so. I’d weigh the reduction in the size of the file sent against the additional time to compress and decompress. To me, you’d need to be sending LARGE blocks of text for compression to make any difference.
Ditto caching - I was dealing with a ridiculously long Home Page with that client and needed to cut down on the time spent to build that page (from a db). As updates were generally not more than one a day, it made sense for me to have the cache updated upon completion of the client’s update (I build backends to allow my clients to maintain their websites).
As I indicated above, it’s a matter of assessing your situation to determine (a) whether compression would be an advantage or, if you’re like me, (b) caching would be an advantage.
IMHO, unless you’re sending a LOT of text (images are already pretty well compressed) OR are “working over” (ABUSING) a database to assemble a page, then you ARE making a mountain out of a mole hill. However, I think this has been a useful discussion and hope you got something out of it.
Yeah, I figured that was going to be the response I received. It just bugged me that I couldn’t use gzip with Bluehost, and I got ants in my pants; however, the site is small enough that I should choose a smaller hammer for this nail.
BlueHost actually has intelligent GZIP compression turned on for all sites.
Where compressing the files would make the pages faster to deliver, the files will be compressed. If the server is overloaded, and waiting for the resources to compress the files would slow the loading of the pages, then the files will be delivered without compression.
At least that’s my understanding of how it works from the discussions I have seen about this on the BlueHost forums.