How often can I use my keyword on a page?

On a website I am working on, I have links to many categories on the home page. But each link has the same word in it.

For example, if I have the word “programmer” as a keyword, am I keyword stuffing if my homepage has links to different pages that look like this?

PHP Programmers
ASP Programmers
SQL programmers
VB programmers
HTML programmers
etc. (I have about 12 different categories)

In addition, I show the exact same links in the sidebar, on every page. So the home page has twice the amount…

Should I change this?

I have - and the test page ranked well until it was decommissioned.

In fact the test page had a “density” score of 20% of all words in Title, Meta Description, Meta Keywords (when they held the possibility of ranking), H1-H6, Bold, Links, and inside <p> tags.

It’s the same for 0% “density” pages where we got the page to rank PURELY on back-links alone.

These examples are just that - examples. Correlation does not imply causation, and I’d be happy to accept that these tests, if run today, would in all probability produce different results across different search engines.

With experience, and a pinch of cynicism, you tend to lump/stereotype people who promote keyword density along with those who hold on to the importance of Toolbar PageRank to ranking.

That was my point. People use the term “keyword density” in terms of a percentage, where a notional threshold that has no statistical basis for fact keeps being perpetuated.

It is now [B]commonly [URL=“http://www.searchenginepeople.com/blog/how-search-really-works-the-keyword-density-myth.html”]accepted[/URL], [URL=“http://www.seomoz.org/ugc/seo-myths-that-persist-keyword-density”]and[/URL] [URL=“http://www.miislita.com/fractals/keyword-density-optimization.html”]agreed[/URL][/B] that “keyword density”, as it has been used, is a redundant theory. If people want to stick to this methodology, they would be better off using the terms “keyword placement/proximity” and “keyword frequency”, which are more of a metric around using the targeted terms (and their stems/synonyms) in terms of [URL=“http://en.wikipedia.org/wiki/Natural_language_processing”]Natural Language Processing[/URL] according to [URL=“http://patft.uspto.gov/netacgi/nph-Parser?Sect1=PTO2&Sect2=HITOFF&u=%2Fnetahtml%2FPTO%2Fsearch-adv.htm&r=1&p=1&f=G&l=50&d=PTXT&S1=7,426,507.PN.&OS=pn/7,426,507&RS=PN/7,426,507”]phrase-based information retrieval[/URL].

And what happens if the density becomes 4.5% or 7.2%? Penalty? Banishment? Keyword density is a myth - frequency and placement is the only thing you need to worry about with keywords.
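To make the distinction concrete, here is a minimal sketch (the sample text and term are made up for illustration) of what the two metrics actually measure: “density” is just occurrences divided by total words as a percentage, while “frequency” is the raw count that, together with placement, is the more meaningful figure.

```python
def keyword_stats(text: str, term: str) -> tuple[int, float]:
    """Return (frequency, density %) of `term` in `text`."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    frequency = words.count(term.lower())
    density = 100.0 * frequency / len(words) if words else 0.0
    return frequency, density

# Hypothetical page copy, just to show the arithmetic.
page = "PHP programmers ASP programmers SQL programmers hire a programmer today"
freq, dens = keyword_stats(page, "programmers")
print(freq, round(dens, 1))  # 3 occurrences in 10 words = 30.0%
```

Note that nothing in this number tells you *where* the term appears (title, heading, anchor text), which is exactly why a bare percentage is such a weak signal.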

I’d agree normally, but the OP is using the full phrase as links to other pages. Since the anchor text of a link is a strong indicator of the targeted page’s relevancy, I’d still recommend using more than just the language names.

I wouldn’t think so - you are only qualifying the links. Including the word ‘programmers’ gives the search engines (and users) a clear description of the links, that without it could mean any number of things relating to those languages.

If possible (and I haven’t looked myself) see if there are any synonyms that you could use to reduce the frequency of that particular term without diluting the links.

In the first case, though the keyword density was high, other factors played quite a role, too.

In the latter one, what happens if you stop building any more links for the page with 0% density? BAAAM!!! It suddenly drops out of the “race”. Unless and until you have something that the person is searching for, I don’t think Google would even give the site its dust.

PS: No personal attacks meant!

Cheers!!

I believe you need to check your main competitors, calculate their average keyword density, and make sure your density is in line with your SERP competitors.
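For what that advice is worth, the arithmetic is straightforward. A hedged sketch (the competitor snippets here are invented placeholders, not real SERP data) of averaging a term’s density across several pages:

```python
def density(words: list[str], term: str) -> float:
    """Percentage of `words` that equal `term`."""
    return 100.0 * words.count(term) / len(words) if words else 0.0

# Hypothetical competitor page copy, tokenised into lowercase words.
competitor_pages = [
    "php programmers for hire".split(),
    "find sql programmers and programmers for vb".split(),
]
avg = sum(density(p, "programmers") for p in competitor_pages) / len(competitor_pages)
print(round(avg, 2))  # (25.0 + 28.57) / 2 ≈ 26.79
```

Whether chasing a competitor average is worthwhile at all is, as the rest of this thread argues, very much open to question.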

Yay, a keyword question. Simple answer is “as many times as you would if search engines didn’t exist”. Seriously.

No mate, it’s not. This has been thrashed out time and time again, and the answer we all came to was that there is no magic percentage that will give you a boost, but if you exceed an acceptable percentage (and that will vary by page, depending on other factors) you will get penalised for keyword stuffing.

You take that to 10%. And see.

Don’t forget that Google uses “200+” signals to judge a site.

Karan

Imagine someone using a screen reader; you’d soon get very tired of hearing it read out the word “programmer” over and over while going through the skills list.

I’d add a suitable heading tag above the list and drop the “p” word

<h3>Find programmers in</h3>
<ul>
  <li>PHP</li>
  <li>ASP</li>
  <li>SQL</li>
  <li>VB</li>
  <li>HTML</li>
</ul>

That would give you a much cleaner-looking list that is easier to scan-read.

Why don’t you try submitexpress.com to check your meta tags, and check your keyword density on gorank.com?

Nothing about Google’s algorithm is completely clear. Just do one thing: make all the content read naturally. Using “programmer” in your navigation is perfectly natural, rather than just writing the technology names. I think Google will appreciate the good content.

I don’t think there’s any problem using a single word multiple times on a page.

But it is problematic if you are using a group of words a number of times. Try to keep the density at no more than 5-7%.

Karan