Well I’ve said it for ages: keyword density doesn’t matter unless it’s too high… and now it’s officially confirmed by Google.
“The Keywords list will sometimes exclude words that we determine to be boiler plate text or common words. This varies from site to site. The Keywords list is a starting point to see how Google is interpreting your site’s content. This list should be evaluated in tandem with what’s listed in the Top Search Queries report for your site as well as how your site appears in the actual search results for the keywords you’re targeting.”
Jonathan Simon of Google
So, if you use your keyword phrase too many times, it will be ignored. This isn’t a negative impact like a punishment or anything, but if Google is ignoring your main keyword phrase, that’s something of a problem when it comes to ranking for it…
Boiler plate and common words (excluded from the Keywords list) vary from site to site. It sounds like for your site we could be doing a better job here.
The more important question though is if this negatively impacts your site’s ranking as you mention. It doesn’t.
I’m not really sure how to interpret their conversation. When I think of “boilerplate text” and “common words” I don’t think of keyword density. It would have been easy for JS to use those terms if that’s what he meant, I guess.
Not sure what your point is, mate. I pointed out in my OP that it’s not an actual penalty like -6 or something, but if they’re ignoring your main keyword phrase it might as well be a ranking penalty, right? Cos you sure ain’t gonna rank for a keyword that Google is ignoring, are you. Frankly I’m just getting a kick out of being right again; it always made sense to me for Google to do this, and I’ve been saying for years that overusing a keyword phrase would trigger some kind of action. Now it’s officially confirmed.
“Boilerplate - a boilerplate is a unit of writing that can be reused over and over without change.”
Not something Google want to encourage in their quest for user-friendly sites to present to their users.
Exactly. Nowhere in the linked Google Webmaster Help thread was keyword density mentioned. The discussion is about “boilerplate text” and “common words”.
You can choose to interpret the discussion in a way that supports your argument about keyword density, but I don’t think that’s what they’re talking about here.
Ah, I see what you’re saying. You’re right, it doesn’t use the word ‘density’, but what it makes clear is that if you overuse a keyword phrase or text element to the point that Google considers it ‘boilerplate’, then it will ignore it. That’s the same thing as establishing a density for that text that is too high, since the more you use something, the higher the percentage of the text it will account for.
Plus, SEORoundtable consider it a keyword density issue, which is some pretty high-level support for the theory, and this whole issue was sparked by this: “a webmaster was asking why the most used words on his site is not showing up in the keywords report in Google Webmaster Tools.”
The most used words on his site… since a keyword by definition is one that’s ‘key’, it’s likely to be used a lot and in important places, therefore bumping up its relative density. This is a keyword density issue.
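For what it’s worth, the “relative density” arithmetic being argued about here is easy to sketch. This is just an illustrative toy, not anything Google has published: the function name, the naive tokenizer, and the sliding-window matching are all my own choices, and nobody outside Google knows what threshold, if any, they actually apply.

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Share of the total word count taken up by occurrences of `phrase` (0.0-1.0)."""
    words = re.findall(r"[a-z']+", text.lower())  # naive tokenizer: letters and apostrophes
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Slide a window over the text and count exact matches of the phrase
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase_words)
    # Each hit accounts for n of the total words
    return hits * n / len(words)

# e.g. "blue widgets" used twice in a nine-word page:
density = keyword_density("blue widgets are great and blue widgets are cheap", "blue widgets")
print(f"{density:.1%}")  # roughly 44% - the more you repeat the phrase, the higher this climbs
```

The point it illustrates is the one made above: because density is occurrences divided by total words, the most-used phrase on a page is, by construction, the one with the highest relative density.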