Efficient selectors

What are your thoughts on efficient selectors?

For instance, in a main navigation with sub-navigation you might have 50+ a tags. If you are using Google’s Page Speed extension, you will get a warning telling you that unless you have given those a tags a class name, that is an inefficient use of selectors.

But giving each of those a tags a class name just so the CSS engine doesn’t have to look at every a tag in the document seems to take away some of the benefit of using CSS in general.

What are your thoughts?


If you want to generally select anchors in a list of 50+ items (all with anchors) and it’s telling you to add a class, that is stupid IMHO. CSS was meant to cascade down the document, easily picking out elements and styling them however you want. If some program tells you it’s inefficient not to add a class (though you should if you want a particular one styled uniquely), then ignore it :slight_smile:
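To put the trade-off in concrete terms, here is a sketch of the two approaches (the id and class names are hypothetical). CSS engines match selectors right to left, which is why Page Speed flags the first rule: every a on the page is checked against it before the ancestor is considered.

```css
/* One descendant rule covers every anchor in the nav.
   Flagged as "inefficient" because the engine first collects
   all <a> elements, then filters by the #main-nav ancestor. */
#main-nav a { color: #333; text-decoration: none; }

/* The "efficient" alternative: a class on each of the 50+ anchors,
   so the engine only considers elements carrying the class. */
.nav-link { color: #333; text-decoration: none; }
```

In practice the first rule is far easier to maintain, and the matching cost is measured in fractions of a millisecond on any realistic page.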

OK, cool, I totally agree. Just wanted to make sure my thinking wasn’t too far off base.

Have a look here:


For most sites it will make no noticeable difference. For really large and speed-dependent sites some gains can be made, but time spent in other areas would usually bring better performance results.

Indeed. It’s a matter of milliseconds.

If you want to optimize your site, start by profiling it so you know what the bottleneck is.

Do you have any specific tips on doing this Simon?

I have YSlow installed in Firefox, which has a lot of useful tools (if only I knew what they meant) :slight_smile:

YSlow seems to be based around a set of checkpoints, not around finding bottlenecks.

Check out the “Resources” tab in Safari’s Web Inspector; it shows loading times for everything on the page. The “Profiles” tab shows where most time is spent in your JavaScript, but keep in mind that optimizing JavaScript might also not make a noticeable difference.

Also investigate profiling server-side; it’s possible that adding a server-side cache for dynamic pages can significantly improve page performance.

Gzip and client-side caching are effective at reducing bandwidth. Intelligent preloading (i.e. loading in the background what the user is likely to want to load next) is good for perceived performance.
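As a rough illustration of the gzip and client-side caching points, here is a hypothetical Apache config fragment (it assumes mod_deflate and mod_expires are enabled; the types and lifetimes are just examples):

```apache
# Compress text responses before sending them to the browser.
AddOutputFilterByType DEFLATE text/html text/css application/javascript

# Let browsers cache static assets so repeat views skip the download.
ExpiresActive On
ExpiresByType image/png "access plus 1 month"
ExpiresByType text/css "access plus 1 week"
```

Compression cuts transfer size for text-heavy responses dramatically, and far-future expiry headers mean repeat visitors fetch almost nothing.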

Some ads might block page loading until the ad itself has loaded, which is a real performance problem, especially since ads usually come from a different server and thus require an extra DNS lookup. If possible, place your ads after the content in the markup, or make them load asynchronously or after the rest of the page has loaded.
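One common way to do that last suggestion is to inject the ad script only after the page has loaded. This is a hypothetical markup sketch; the URL and id are placeholders, not a real ad network’s API:

```html
<div id="content">…page content here…</div>
<div id="ad-slot"></div>
<script type="text/javascript">
  // Inject the ad script on window load, so a slow ad server
  // can no longer block the initial render of the page.
  window.onload = function () {
    var s = document.createElement('script');
    s.src = 'http://ads.example.com/ad.js';
    document.getElementById('ad-slot').appendChild(s);
  };
</script>
```

The visible effect is that the page content appears immediately and the ad fills in whenever its server responds.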

Thanks for the insight Simon. Some good tips there.

I did purchase the Steve Souders book I linked to above; most of it wasn’t relevant to me, but it was very interesting all the same. He did have a good fix for turning some IE expressions into run-once routines to speed up the IE6 min- and max-width workarounds.