Are there any ways you can do this concatenation of files locally? It’s not in an online environment. Also, are there any ways to safely minify your script without it breaking? From experience, when you minify your JS it ends up breaking.
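For what it’s worth, concatenation itself needs nothing online at all — plain command-line tools do it locally. A minimal sketch (the file names are made up for the demo):

```shell
# Concatenation needs no online service - plain cat works locally.
# (File names and contents here are invented for the demo.)
printf 'body { margin: 0; }\n' > reset.css
printf 'h1 { color: red; }\n'  > theme.css
cat reset.css theme.css > all.css   # one file, one HTTP request
wc -c all.css                       # check the combined size
```

As for minifying safely: breakage usually comes from regex-based strippers tripping over missing semicolons. Tools that actually parse the JavaScript (UglifyJS or Google’s Closure Compiler, for example) are much less likely to break your scripts.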
In addition to what you and others have listed, I’d also take the backend into account. It’s equally important in terms of performance. Whatever language, web framework or CMS you’re using, there are always ways to improve performance and trim the fat. Some CMSes are hopelessly bloated, so your options are somewhat limited and depend on the developers who created the CMS/web framework.
Still, there’s quite a lot you can do, so I’d scan your site for bottlenecks and look into how to improve on them.
Same goes for databases: what kind of database you’re using, and whether or not you’re applying extra layers on top. Depending on what you use, there may be options to store your data in other ways.
His books are great, too, and I can wholeheartedly recommend them.
Well, you can group selectors and use as much shorthand code as possible. Thousands of declarations sounds like quite a lot to me, and I have the feeling you could reduce that considerably by approaching the markup differently, i.e. creating basic, reusable containers and styles.
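To make that concrete, a small sketch of what grouping plus shorthand looks like (class names invented):

```css
/* Before: the same declarations repeated per selector
   .box-a { margin: 10px 10px 10px 10px; background-color: #fff; }
   .box-b { margin: 10px 10px 10px 10px; background-color: #fff; } */

/* After: grouped selectors and shorthand properties */
.box-a,
.box-b { margin: 10px; background: #fff; }
```

Applied across thousands of declarations, that kind of consolidation adds up quickly.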
In my mind, the best approach is a solid balance between scalability and performance.
It might be worth looking into YSlow if you want to reduce your page load times.
However, all of the suggestions raised are to do with front-end optimisation. In my experience it’s definitely worth using a profiler on slow parts of your website to see if there are any issues with your code. For compiled frameworks like .NET this can help tremendously.
It’s mainly due to the CSS switcher: it comprises over 200 different backgrounds with over 8 style variations and 7 different colors. On top of that it’s both responsive and retina-ready, so this did get a little complex. Normally I would not have so many declarations, but with all that said, I hope it’s understandable that this was not my choice. Typically I end up with 50KB of CSS; for this template it’s over 100KB! So we need to fix this somehow.
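One thing that might keep the declaration count down in a switcher like that — purely a sketch, with invented class and file names — is keying each background off a single body class and declaring the shared properties only once:

```css
/* One short rule per background, selected via a class on <body> */
body.bg-wood  { background-image: url(img/bg-wood.jpg); }
body.bg-denim { background-image: url(img/bg-denim.jpg); }
/* ...one line each for the remaining backgrounds... */

/* Shared properties declared once, not repeated in every rule */
body[class^="bg-"] { background-repeat: repeat; background-position: 0 0; }
```

That way each of the 200 backgrounds costs one line rather than a full block of repeated declarations.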
I also found that when placing the external jQuery at the bottom of the page (just above the closing body tag) I get a FOUC (flash of unstyled content), something I did not get before.
There are around 15 different script files. I used the minified versions where possible. In addition to this I used head.js. A friend recommended another tool, the jQuery validation plug-in, but I don’t know how to use it, and I don’t think it will help here, as its purpose is validation, not minifying.
What I liked about head.js is that I could keep the files in their natural state and it would run them as a single file on its own. The only downside to this plug-in is that it does not minify the scripts. I could minify each CSS and JS file individually, which would probably save more space. A dream plug-in would be one that combines and minifies at the same time.
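The combining half of that dream plug-in is trivial to do locally, and you can measure what gzip adds on top; a sketch with invented file contents (a real minifier such as UglifyJS would handle the minifying step):

```shell
# Demo files (contents invented); in practice these are your real scripts
printf 'function add(a, b) { return a + b; }\n' > plugin.js
printf 'console.log(add(1, 2));\n'              > app.js

cat plugin.js app.js > combined.js       # combine: one request instead of two
gzip -9 -c combined.js > combined.js.gz  # gzip on top of combining
wc -c combined.js combined.js.gz         # compare the sizes
```

Combining cuts HTTP requests; minifying and gzipping then shrink what’s left.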
I have used sprites a couple of times. This would be another cool way to speed up the website; anything helps at this stage. Not entirely sure how you’d do the CSS, though. Having sprites would limit the HTTP requests for files. Maybe another plug-in might be the solution here. CSS sprites are good, but from what I found you’d end up downloading a huge image you could do without. For instance, the template comprises over 200 different backgrounds, some small, others big. A single CSS sprite would limit the requests but force a complete download.
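On the “not sure how you’d do the CSS” part — the usual pattern is one shared image with `background-position` picking out each piece; a sketch with invented file name and offsets:

```css
/* One sprite image holds several icons; background-position selects one.
   File name and pixel offsets are invented for the sketch. */
.icon      { background: url(img/sprite.png) no-repeat; width: 16px; height: 16px; }
.icon-home { background-position: 0 0; }
.icon-mail { background-position: -16px 0; }
.icon-user { background-position: -32px 0; }
```

Sprites suit small, frequently used images like icons; for 200 large backgrounds the single-huge-download problem you describe is real, so grouping only the small shared graphics into a sprite may be the sensible middle ground.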
The trouble is that there seem to be plug-ins for everything, and including all of these would further reduce the speed of your web page. A little like downloading a program to speed up your computer, which undoubtedly slows your machine down further.
For the images I use FastStone Resizer, which works wonders. It manages to reduce the file size of those images to something a little more acceptable. Maybe this combined with CSS sprites might further help the situation.
Because it’s a template there will be no database. As far as I know, the only thing I can do from the server side is to GZIP it and have it run on a CDN, something like CloudFlare. The CSS is as short as it can be, but maybe there is a little more I can do. It might also be worth using Prefixfree, which would probably further reduce the overall CSS file size.
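You can estimate what server-side GZIP buys you locally before touching the server; a sketch using generated repetitive CSS as a stand-in for the real 100KB stylesheet:

```shell
# Rough local estimate of server-side gzip savings on repetitive CSS.
# (Generated content stands in for the real stylesheet.)
for i in $(seq 1 200); do
  printf '.bg-%s { background-image: url(img/bg-%s.jpg); }\n' "$i" "$i"
done > style.css
gzip -9 -c style.css > style.css.gz
wc -c style.css style.css.gz   # gzipped copy is a fraction of the original
```

Repetitive CSS compresses extremely well, so GZIP tends to help exactly the kind of 200-background stylesheet you’re describing.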
I’ve used YSlow before. It’s a great tool for isolating different issues, and I will certainly use it. The only drawback I found is that the site needs to be uploaded to a live environment to see it fully working.
Just wondering: would validation also play a role in the download speed of a web page? I tried to see what Facebook and Yahoo do, but their pages don’t validate, and they seem to have some strange scripts going on there. Certainly not what I would call readable.