But when I run a speed test it tells me
Leverage browser caching for the following cacheable resources:
http://www.graemeevanslandscapes.co.uk/images/bodenham2 (Custom).jpg (expiration not specified)
http://www.graemeevanslandscapes.co.uk/images/bodenham3 (Custom).jpg (expiration not specified)
http://www.graemeevanslandscapes.co.uk/images/bodenham4 (Custom).jpg (expiration not specified)
http://www.graemeevanslandscapes.co.uk/images/bodenham5 (Custom).jpg (expiration not specified)
http://www.graemeevanslandscapes.co.uk/images/caravan1 (Custom).jpg (expiration not specified)
http://www.graemeevanslandscapes.co.uk/images/civil2-thumb.JPG (expiration not specified)
...
It’s the same caching setup I always use, on the same server, and it’s normally OK.
Just uploaded another website to the same server and using same directory structure.
I copied the same .htaccess from previous site and just changed the name in the rewrite rule (www.graemeevanslandscapes.co.uk → www.motortraining.co.uk)
This time Google agrees I have enabled browser caching!
Just can’t see where the difference is…
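For comparison, the caching rules in question are typically a mod_expires block along these lines (a sketch only — the exact types and lifetimes are whatever your original .htaccess uses, and mod_expires has to be enabled by the host):

```apache
# Requires mod_expires to be enabled on the server
<IfModule mod_expires.c>
  ExpiresActive On
  # Serve images with a one-month cache lifetime; adjust to taste
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType image/gif  "access plus 1 month"
</IfModule>
```

If the speed test still reports “expiration not specified” for the images, the rules are valid but simply not being applied to those files — which is worth checking before rewriting the rules themselves.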
In my experience, occasionally Google is just … wrong.
Webmaster Tools started reporting that the robots.txt file on one of my sites was unreachable. I’d made no changes to it (or the site) and Google had been accessing it without issue for over a year. I could access it OK; fetch as Googlebot could not. I tried everything, even deleting it and uploading a new copy. Nothing would work. And then one day (about three months later) the problem vanished as mysteriously as it appeared.
Oops, I’d love to blame Google, but this time it was my fault! I forgot the image directory is aliased and is actually a level below the public folder where the .htaccess is!
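That explains the symptom: .htaccess files apply per filesystem directory, and an Alias maps a URL path to a directory outside the tree the public folder’s .htaccess covers. A sketch of the situation and one possible fix (paths are made up for illustration; Alias itself normally lives in the main server config, not in .htaccess):

```apache
# In the server config: /images is mapped to a directory outside
# the public folder, so the public folder's .htaccess never applies to it
Alias /images /home/site/media/images

# One fix: put a copy of the caching rules in
# /home/site/media/images/.htaccess
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
</IfModule>
```

This only works if the server’s AllowOverride setting permits .htaccess files in the aliased directory.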
I’ve not seen that with the robots file, but I do get a weird one every now and again where a Fetch as Google comes back as partial with some assets not available. Try it enough times and, like your robots issue, the problem mysteriously vanishes…
I have been designing sites for nearly 20 years; only recently has Google penalised us for doing stuff wrong.
This “Leverage Browser Caching” warning has always caught me out and I have never been able to remedy it.
I have altered my .htaccess hundreds of times, thinking it was wrong…
I have been designing and checking web pages in sub folders of sub folders, as you do, using other domains before going live, and all this time I was correct. I feel stupid for searching for something which wasn’t a problem. To fix the Leverage Browser Caching warning, ALL you do is move the designed page to the main directory.
Thank you very much for saving my time.
I must add I have never read this before.
The way I understand it, .htaccess affects everything on its own level and below (sub folders), so you could have other .htaccess files in the sub folders which override those settings in those folders only.
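That inheritance can be sketched like this (directives invented for illustration; a child .htaccess overrides only the directives it repeats, the rest are inherited from the parent):

```apache
# /public/.htaccess — applies to /public and every sub folder below it
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresDefault "access plus 1 week"
</IfModule>

# /public/gallery/.htaccess — overrides the parent for this folder only;
# everything outside /public/gallery still gets the one-week default
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresDefault "access plus 1 year"
</IfModule>
```

So a page tested in a deep sub folder may be picking up a different (or missing) set of caching rules than the same page would in the main directory — which would explain the warning disappearing after the move.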