Making CSS more "efficient"?

Hi,

I’m using the Google PageSpeed tool to find out how to get our pages loading quicker. We’ve got the pages up to 98/100, but I wanna get them even quicker :wink:

It’s moaning about these CSS selectors:

* #header .col-3 div    Tag key with 2 descendant selectors
* #header .link:hover span    Tag key with 2 descendant selectors
* .menu li a    Tag key with 2 descendant selectors
* .menu li a:hover    Tag key with 2 descendant selectors
* .column-3 span span    Tag key with 2 descendant selectors
* table tr td    Tag key with 2 descendant selectors
* .order_form tr td    Tag key with 2 descendant selectors
* #header_menu li a    Tag key with 2 descendant selectors
* #header_menu li a:hover    Tag key with 2 descendant selectors
* .cluetip-default #cluetip-title a    Tag key with 2 descendant selectors
* .demoMenu a:active span    Tag key with 2 descendant selectors
* .demoMenu a:focus span    Tag key with 2 descendant selectors
* .demoMenu a:hover span    Tag key with 2 descendant selectors
* .demoMenu .yahoo span    Tag key with 2 descendant selectors
* .demoMenu .facebook span    Tag key with 2 descendant selectors
* .demoMenu .twitter span    Tag key with 2 descendant selectors
* .demoMenu .digg span    Tag key with 2 descendant selectors
* .demoMenu .main_bookmarks span    Tag key with 2 descendant selectors
* .demoMenu .stumble span    Tag key with 2 descendant selectors
* .demoMenu .myspace span    Tag key with 2 descendant selectors
* .demoMenu .del span    Tag key with 2 descendant selectors
* .demoMenu .email span    Tag key with 2 descendant selectors
* .demoMenu .more span    Tag key with 2 descendant selectors

Can anyone suggest how to make this “go away” and potentially speed up the page load time?

The CSS is all built into a single minified file, so it’s not readable as-is.

If it helps, I can post a link to the non-minified CSS (just let me know :))

TIA!

Andy

The script is only ever run when we make changes to the files :slight_smile: We name the files whatever.html.orig, and the script finds those pages, compresses them, and saves them to whatever.html. Very little overhead :slight_smile:

Which often consumes MORE bandwidth by pushing response headers into two packets instead of one… Usually the browser’s default caching period on static content is quite sufficient, and those savings are more myth than fact. It can also cost you server-side on first load, as the decrease in accesses results in commonly used static elements being purged from the pool more often… that one is REALLY a balancing act - and often a waste of time.

Yeah, I think we have it set to 1 month (not the 1 year that Google recommended - which is madness!)

TIA

Andy

[ot]

I know the difference between echo and print in PHP. It still doesn’t matter which one you use. There are more important things to worry about. Any micro-optimizations you do will be invalidated the moment HTTP and network latency are involved, making them absolutely pointless.[/ot]

The savings are indeed minimal but both have advantages, depending on what you want to do. See

http://www.faqts.com/knowledge_base/view.phtml/aid/1/fid/40
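
For what it’s worth, the practical difference boils down to something like this (a minimal sketch):

<?php
// print is an expression that always returns 1,
// so it can be used where a value is expected
$ok = print "hello\n";

// echo is a language construct that accepts
// multiple comma-separated arguments
echo 'foo', 'bar', "\n";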

Yeah, that’s white-space stripping - waste-of-time white-space stripping at that… and really SLOW waste-of-time white-space stripping to boot.

ultranerds sent me their URL via PM, and it’s being sent compressed by Apache anyway, so that script is making it take MORE time, not less.

Looking at the code in question, it’s chock full of outdated markup. It is EXACTLY the type of code I’m referring to when I say don’t waste your time on all that scripted nonsense and white-space stripping; instead, write MODERN MARKUP with separation of presentation from content.

It’s knee-deep in deprecated tags, tables for layout, non-semantic markup, heading tag abuse, invalid heading orders, etc, etc, etc…

For example, the first heading on the page is an h5, followed by two h6’s, and then TWENTY h1’s… which then skip over h2, h3 and h4 again, right to another h5 and two h6’s. Invalid heading orders and a nonsensical document structure.
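
For reference, a sane heading structure steps down one level at a time - a skeleton like this (the heading text is obviously placeholder):

<h1>Site / page title</h1>
	<h2>Major section</h2>
		<h3>Subsection of that section</h3>
	<h2>Next major section</h2>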

To put it in the simplest terms: it’s 27k of markup after white-space stripping, it has WAY too many keywords stuffed in the meta (12:1 the keywords meta is being ignored on account of that), it has what appears to be about 10k of static javascript inlined in the markup, and with 239 validation errors it’s not even HTML - it’s 100% gibberish.

That 27k of HTML is a real hair-raiser, if for no other reason than there’s only 6k of content, a dozen content images and one object embed. There’s NO excuse for that to be more than 15k WITH the whitespace still in it.

Making the CSS more efficient is the LEAST of your problems.

Lemme give you an example. Take this section (carefully chosen so it won’t reveal the website itself):

<table border=0 width="100%"><tr><td width="50%"><font size=3 face="arial,helvetica,sansserif"><b>TRAVEL
 WINDOW: </b>&nbsp;</font><p><font face="arial,helvetica,sansserif"><b>From :</b> </font><font size=3 face="arial,helvetica,sansserif">APRIL 19 -10&nbsp;&nbsp;&nbsp;<br><b>To :</b>&nbsp;&nbsp;&nbsp;&nbsp; JUNE 30 - 10&nbsp;</font>&nbsp;</p><p><font size=3 color="red" face="arial,helvetica,sansserif"><b>CAT. OCEAN
 VIEW SUPERIOR &nbsp;</b></font>&nbsp;</p><ul><li><font size=3 face="arial,helvetica,sansserif"><b>DBL</b> $ 57,00 USD
 PER NIGHT</font></li><li><font size=3 face="arial,helvetica,sansserif"><b>SGL</b> $ 113,00 USD
 PER NIGHT</font></li></ul></td><td width="50%"><font size=3 face="arial,helvetica,sansserif"><wbr><b>TRAVEL WINDOW:</b> &nbsp;</font><p><font size=3 face="arial,helvetica,sansserif"><b>From :</b> JULY 01-10&nbsp;<br><b>To :</b>&nbsp;&nbsp; &nbsp;&nbsp; AUGUST 31- 10&nbsp;</font>&nbsp;</p><p><font size=3 color="red" face="arial,helvetica,sansserif"><b>CAT. OCEAN
 VIEW SUPERIOR &nbsp;</b></font>&nbsp;</p><ul><li><font size=3 face="arial,helvetica,sansserif">DBL $64,00 USD PER NIGHT</font></li><li><font size=3 face="arial,helvetica,sansserif">SGL $127,00 USD PER
 NIGHT</font></li></ul></td></tr></table>

Makes me immediately kneejerk “What is this, 1998?” See ALL those FONT declarations? GET RID OF THEM - that’s CSS’ job. See the paragraphs? You don’t actually seem to HAVE grammatical paragraphs of content there, so I’m not certain what you are even using those FOR…

That’s 1.2k even once whitespace-stripped. Written “properly”, that section would probably read:


<div class="travelBox">
	<h2>Travel Window</h2>
	<b>From:</b> April 19 - 10<br />
	<b>To:</b> June 30 - 10<br />
	<h3>CAT. OCEAN VIEW SUPERIOR</h3>
	<ul>
		<li><b>DBL</b> $ 57,00 USD PER NIGHT</li>
		<li><b>SGL</b> $ 113,00 USD PER NIGHT</li>
	</ul>
</div>

<div class="travelBox">
	<h2>Travel Window</h2>
	<b>From:</b> July 01 - 10<br />
	<b>To:</b> August 31 - 10<br />
	<h3>CAT. OCEAN VIEW SUPERIOR</h3>
	<ul>
		<li><b>DBL</b> $ 64,00 USD PER NIGHT</li>
		<li><b>SGL</b> $ 127,00 USD PER NIGHT</li>
	</ul>
</div>

Which is around 500 bytes - a reduction of more than HALF. Mind you, that’s a wild guess, since I’m not 100% certain the “CAT. OCEAN VIEW SUPERIOR” should be an h3 - to me that’s gibberish… or whether it should be the h2, before the “travel window” text. In general that section is very poorly written, not just in code but in content as well.

But content issues aside, see what I mean? HALF the code. EVERYTHING else you are doing there could be handled from the CSS. CSS is cached, HTML is not.
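
The matching CSS would run along these lines - just a rough sketch of what those FONT tags and the two-cell TABLE were doing, with the class name and values being my guesses:

.travelBox {
	float: left;
	width: 50%;
	font: normal 100%/1.5 arial, helvetica, sans-serif;
}

.travelBox h3 {
	color: red;
	text-transform: uppercase;
}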

If anything, I’d say you don’t HAVE enough CSS to be worrying about it… you’ve got WAY too much static stuff sitting in the markup that SHOULD be in external files like .css and .js - that way you’d actually take advantage of the caching models.
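
Pulling that stuff out is nothing more exotic than this (the filenames are hypothetical, obviously):

<link rel="stylesheet" href="screen.css" media="screen,projection" />
<script type="text/javascript" src="scripts.js"></script>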

Though the REAL pig is that steaming pile of javascript just to embed a flash element, and the time wasted on Google Analytics. What, don’t you have server logs? :wink:

Oh, and another tip: if you are resorting to things like ‘blank.gif’, you’re still coding with decade-out-of-date methods.

Again, the CSS is the least of your problems there.

Hi,

AtSea - thanks, will give that a go :slight_smile:

deathshadow60 - totally agree with you, but I’m determined to get it faster. We already have mod_gzip, compressed HTML pages (made with a .cgi script), compressed CSS/JS files (again, made with a .cgi script), and have set up expires headers, CSS sprites etc. All is working well (it’s taken the page load time from 4+ seconds to about 1.5 seconds now)

Cheers

Andy

Hi,

It’s very simple to write a script that compresses it using HTML::Clean:

#!/usr/bin/perl

use strict;
use warnings;
use HTML::Clean;

# strip whitespace and comments
my %options = (
	whitespace => 1,
	comments   => 1,
);

my $ssi_folder = "/home/user/domain.com/www";

opendir(my $dh, $ssi_folder) || die "can't opendir $ssi_folder: $!";
while (my $filename = readdir $dh) {
	next if $filename =~ /^\./ || $filename =~ /^minified_/;
	next unless $filename =~ /\.orig$/;

	# whatever.html.orig gets written out as whatever.html
	my $write_file = $filename;
	$write_file =~ s/\.orig$//;

	my $h = HTML::Clean->new("$ssi_folder/$filename");
	$h->compat();
	$h->strip(\%options);
	my $data = $h->data();    # returns a reference to the cleaned markup

	open(my $out, '>', "$ssi_folder/$write_file")
		|| die "Can't write $ssi_folder/$write_file. Reason: $!";
	print $out $$data;
	close($out);

	print "Done $ssi_folder/$write_file ...\n";
}
closedir $dh;

The above code is ONLY run when we make a change to a .orig file (manually, via SSH) - so there is no overhead in terms of what the user sees in load time etc.

Regarding the link - I’ll send you a PM (afraid I can’t really post it online)

white-space stripping and not using REAL gzip/mod_deflate compression?!?

I believe the host has it set up with mod_gzip.

Cheers

Andy

Which means you are just white-space stripping and not using REAL gzip/mod_deflate compression?!?

Trying to figure out how a ‘cgi’ would be more efficient or even do anything MEANINGFUL at all… or how you could even GET a CGI to do it without calling gzip and sending the proper response headers on every call… Even if you saved the raw gzip, you’d still have the overhead of checking for changes, the second file access, and sending the new headers…

… at which point, just let the server deal with it using mod_deflate. I think you’ve gone off to noodle-doodle land on that one (unless I’m COMPLETELY missing your meaning), as the server load of what you are describing CANNOT be saving you anything meaningful: white-space stripping rarely pulls more than 3 bytes per LINE of code, and that’s the only thing you could do without gzip-type compression and its header overhead - making it a total waste of time to write code for, instead of adding one or two lines to your .htaccess. (assuming you’re hosted on a REAL server and not some winblows crap)
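
Those “one or two lines” would look something like this - a minimal sketch assuming Apache with mod_deflate and mod_expires enabled:

# gzip text resources on the fly (mod_deflate)
AddOutputFilterByType DEFLATE text/html text/css application/javascript

# and let static files ride the browser cache (mod_expires)
ExpiresActive On
ExpiresByType text/css "access plus 1 month"
ExpiresByType application/javascript "access plus 1 month"
ExpiresByType image/png "access plus 1 month"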

I’d have to see the actual page in question, but really google’s tool tends to ***** about crap in the CSS that would chew MORE bandwidth and make the page SLOWER, since you’d have to put more classes in the HTML.

Just part of why I don’t trust automatic ‘tools’ for much of anything. They’re a crutch that often causes more harm than good, or raises undue alarm over non-issues.

But again, no site/code, no help for you and anything we say would be a wild guess in the dark.

Take the fix atSea suggested - it would be fine unless you also happen to have “Content .col-3 div” defined somewhere (like, say, in another stylesheet), or even a separate “.col-3 div” that gets styled differently from the ones in #header - in which case removing the ID could bork the whole page… You could also be using the ID to resolve specificity, in which case its removal would necessitate the use of !important, which is just as slow.
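
To illustrate the specificity trap (the declarations here are hypothetical):

/* elsewhere in the stylesheet */
.col-3 div { color: #333; }          /* specificity 0,1,1 */

/* meant for the header only - wins inside #header at 1,1,1 */
#header .col-3 div { color: #fff; }

/* the "optimized" version drops to 0,1,1 - it now ties with the
   first rule, so source order decides, and it hits EVERY
   .col-3 div on the page, not just the ones in #header */
.col-3 div { color: #fff; }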

… and I guarantee that’s something their ‘tool’ doesn’t take into account. Said tool advocates **** I would never do to my code, like that ‘minifying’ crap. White-space stripping is NOT the answer, people! Go ahead, shtup maintaining the site… ANYONE who tells you to white-space strip to save bandwidth is probably too lazy to fix what’s REALLY wrong with a site, and they’re the LAST people you should take optimization advice from. (especially since, if you serve gzipped/mod_deflate, who gives a ****)

Taking a look at a couple of my sites in it, the **** it’s *****ing about in the “efficient CSS” crap would break half the pages on my site if I “reduced” them, either due to specificity or the re-use of the same class in different sections…

Much less it doesn’t take into account that information contained in the CSS might be used by SUBPAGES, and is present to pre-cache it and speed up subsequent pageloads; they act like that takes longer than a separate file request (Bull). Or their image optimizer saying “losslessly” when it does horrific damage to most images (like stripping palette transparency, breaking the image format in IE6/earlier, etc, etc), claiming 2-3k of savings off 800-byte images while saving them as 4k+ - NOT exactly blowing my skirt up, guys.

Post up a URL and I’ll give you a much more meaningful speed analysis - assuming you can take actual criticism, since I don’t slap rose-coloured glasses on your head to sugar-coat it. I will tell you EXACTLY what I see ‘wrong’ and tell you how to fix it.

For writing “efficient” CSS… don’t bother, honestly. You’ll save maybe 2 milliseconds (2/1000 of a second).

[ot]

It doesn’t matter; both do the same thing. Use whatever.
Besides, the OP is using Perl, not PHP.[/ot]

I won’t go into long-winded detail as to what it is, but you can see this page for a review:

http://css.maxdesign.com.au/selectutorial/selectors_descendant.htm

The simple quick fix is to remove the leading items as they become redundant.

#header .col-3 div becomes .col-3 div
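
In other words, something like this (the declarations are just placeholders):

/* before: tag key with 2 descendant selectors */
#header .col-3 div { margin: 0; }

/* after: fine so long as .col-3 only ever appears inside #header */
.col-3 div { margin: 0; }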

Those are static content - why would you be wasting server overhead on using CGI for that, instead of just setting Apache to do it automatically? (or even on the fly via .htaccess?)

Which often consumes MORE bandwidth by pushing response headers into two packets instead of one… Usually the browser’s default caching period on static content is quite sufficient, and those savings are more myth than fact. It can also cost you server-side on first load, as the decrease in accesses results in commonly used static elements being purged from the pool more often… that one is REALLY a balancing act - and often a waste of time.

Again though, if you’re willing to share a URL, we could probably help you push it a LOT further than any alleged savings that faulty ‘tool’ could ever give you.

Do what Shadow suggested - I do all of that in my .htaccess file.

Can’t you check the header info of the page in Firebug? I always thought that’s how you check to see if gzip is working as intended.
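
From a shell you can do much the same with curl - something like this (swap in the real URL, obviously):

# GET the page, discard the body, and dump the response headers
curl -s -o /dev/null -D - -H "Accept-Encoding: gzip" http://www.example.com/ | grep -i content-encoding

If gzip/deflate is working, you should see “Content-Encoding: gzip” come back.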

BTW, avoid print in your PHP. Use echo.

http://www.learnphponline.com/php-basics/php-echo-vs-print