I have - and the test page ranked well until it was decommissioned.
In fact the test page had a “density” score of 20% of all words in Title, Meta Description, Meta Keywords (when they held the possibility of ranking), H1-H6, Bold, Links, and inside <p> tags.
It’s the same for 0% “density” pages, where we got pages to rank PURELY on back-links alone.
These examples are just that - examples. Correlation does not imply causation, and I’d be happy to accept that these tests, if run today, would in all probability produce different results across different search engines.
With experience, and a pinch of cynicism, you tend to lump/stereotype people who promote keyword density along with those who hold on to the importance of Toolbar PageRank to ranking.
That was my point. People use the term “keyword density” as a percentage, and a notional threshold with no statistical basis keeps being perpetuated.
And what happens if the density becomes 4.5% or 7.2%? Penalty? Banishment? Keyword density is a myth - frequency and placement are the only things you need to worry about with keywords.
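For what it’s worth, the percentage everyone argues about is just occurrences over total words. Here’s a rough sketch of that arithmetic - the tokenizer and the sample text are my own simplifications for illustration, not how any search engine actually measures anything:

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Naive keyword density: words covered by phrase matches / total words, as a percent."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    if not words or not phrase_words:
        return 0.0
    n = len(phrase_words)
    # Count exact, consecutive matches of the phrase in the word stream.
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return 100.0 * hits * n / len(words)

sample = "python programmers hire python programmers for python work"
print(round(keyword_density(sample, "python"), 1))  # → 37.5
```

Run it and you’ll see how arbitrary the magic thresholds are: the same eight-word snippet scores 37.5% for “python” and 50% for “python programmers”, so which number is supposedly getting you penalised?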
I’d agree normally, but the OP is using the full phrase as links to other pages. Since the anchor text of a link is a strong relevancy indicator for the targeted page, I’d still recommend using more than just the language names.
I wouldn’t think so - you are only qualifying the links. Including the word ‘programmers’ gives the search engines (and users) a clear description of the links, that without it could mean any number of things relating to those languages.
If possible (and I haven’t looked myself) see if there are any synonyms that you could use to reduce the frequency of that particular term without diluting the links.
In the first case, though the keyword density was high, other factors played a significant role, too.
In the latter case, what happens if you stop building links to the page with 0% density? BAAAM!!! It suddenly drops out of the “race”. Unless and until you have something the person is actually searching for, I don’t think Google will give the site a second look.
I believe you need to check your main competitors, calculate their average keyword density, and make sure yours is in line with the average across your SERP competitors.
Yay, a keyword question. Simple answer is “as many times as you would if search engines didn’t exist”. Seriously.
No mate, it’s not. This has been thrashed out time and time again, and the answer we all came to was that there is no magic percentage that will give you a boost, but if you exceed an acceptable threshold (which will vary by page depending on other factors) you will get penalised for keyword stuffing.
Nothing about Google’s algorithm is completely clear. Just one thing: try to write all your content in a natural way. Using the word “programmers” in your navigation is far more natural than just listing the technologies, and I think Google will appreciate that kind of content.