FAQ: Search Engine Optimization

NOTE: This post is being reviewed. Be aware that the links don’t work. The info is still good though.

This thread/post is designed to help answer the most common questions here and also to help spark some debate. The answers below are not perfect, nor do I claim they are. I’m hoping for feedback to help make this more complete and accurate, so please feel free to point out any mistakes, omissions, or other problems. If I missed a question you feel is common, let me know so I can add it here.

(Since SEO is not a science this is a gutsy post to make so cut me some slack! :))

Google now has its own FAQ. Click here to view it.

Build your website for human beings, not search engines!

What are SERPs?
What is anchor text? Why is it important?
How do I get a lot of backlinks to point to my site?
How many keywords should I put into my <title>, <a>, and <h1>…<h6> tags?
Meta Tags
What happens if I use includes for my pages? Will the search engines see them?
Should I submit my website to the Search Engines by hand or use software?
How often should I submit my website to the search engines?
Does a page’s file extension (.html, .php, etc.) affect its rankings?
Sites with .com rank higher than with <TLD here>
Pages with query strings don’t rank as well as without query strings
Should I use relative links or absolute links?
I just changed from .html to .php. How can I switch without losing my rankings?
I just changed my domain name. How can I switch without losing my rankings?
Why aren’t all my pages being indexed?
How do I check if my site is search engine friendly?
What does it mean to have your site indexed by the search engines?
Which is better for domain name and/or url: hyphen (-), underscore(_), or plus sign(+)?
Will too many hyphens in your domain name cause the search engines to label your site as spam?
Does the order of the keywords used in a search affect the search results?
Does the order of the keywords in a domain name/URL matter?
Does using automated SEO software cause a website to be penalized?
Can search engines see password protected pages?
Which is better for SEO: text links or graphical links?
Does validation help your ranking?
Can the search engines read javascript?
What is a quality link?
Why should I not launch an incomplete website?
What is referrer spam?
What is a doorway page?
Hidden Text/Hidden DIVs
Does changing hosting affect my ranking?
What is a “good crawlable design”?
Flash and SEO
Is Ajax bad for SEO?
Do outbound links help my rankings?
Does a page’s traffic affect its rank?
What about reciprocal links?
What keyword tools should I use and how do I use them?
How do I improve my rankings for country-specific search?
What directories should I submit my site to?
Common SEO Myths
What is the story with Alexa?
What would be a good SEO strategy?
Good SitePoint Articles that you might find useful:
Google based tools
Yahoo! based tools
MSN based tools
Keyword research tools
Link research tools
Other Tools

What are SERPs?

SERPs is an acronym for Search Engine Results Pages. Basically they are the search results you receive when doing a search at a search engine.

What is anchor text? Why is it important?

Anchor text is the visible hyperlinked text on the page. For example, let’s examine this code:

<a href="http://www.sitepoint.com/forums/">Webmaster Forums</a>

The anchor text for this link is “Webmaster Forums”. This is important in search engine rankings because the search engines use anchor text to help determine the relevance of a page being linked to for those keywords. By having this link pointing to their forum’s web page, SitePoint Forums will perform better in searches for the phrase “webmaster forums” (and other similar phrases as well).
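If you’re curious how this works mechanically, here is a rough sketch (not any engine’s actual code) of pulling anchor text out of markup using Python’s standard html.parser:

```python
from html.parser import HTMLParser

class AnchorTextExtractor(HTMLParser):
    """Collects (href, anchor text) pairs, roughly as a crawler might."""
    def __init__(self):
        super().__init__()
        self._href = None   # href of the <a> tag we are inside, if any
        self._text = []     # text fragments collected inside that tag
        self.links = []     # finished (href, anchor_text) pairs

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

parser = AnchorTextExtractor()
parser.feed('<a href="http://www.sitepoint.com/forums/">Webmaster Forums</a>')
print(parser.links)  # [('http://www.sitepoint.com/forums/', 'Webmaster Forums')]
```

The engines associate that extracted text (“Webmaster Forums”) with the target URL, which is why descriptive anchor text beats “click here”.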

How do I get a lot of backlinks to point to my site?

A good place to start is to submit to directories. Start with the free ones and then decide whether paid ones are worth it for you. Here’s a great place where you can find free directory listings sorted by PR, Alexa rank (worthless), and more: http://www.tipsntutorials.com/Top-Directories/ and also check out http://www.isedb.com as it lists thousands of places you might find worth submitting to.

I was thinking of doing <seo trick here> to my site but I’m afraid the search engines might think it is spam. Should I do it?

No. Why? If you’re not sure whether it will get you in trouble with the search engines, then it’s probably something you shouldn’t be doing. Another good reason not to do it is accessibility. Many webmasters employ hacks and tricks in an effort to increase their search engine ranking. Often these tricks come at the expense of the usability of their website, not only for those who have disabilities but for anyone trying to navigate the site.

How many keywords should I put into my <title>, <a>, and <h1>…<h6> tags?

You should only put the few keywords that are most relevant to your pages. The more you put in each tag, the more you dilute the value each keyword is given.

<h1>Advanced PHP Programming</h1>

is better than

<h1>Advanced PHP Programming Is Really Cool And Stuff Dude</h1>

Meta Tags

The large majority of search engines do not use Meta Tags as part of their ranking algorithm. Some will claim Google uses Meta tags in its algorithm. This is entirely untrue. Google, however, will use a meta description tag if it is unable to discern a description for a webpage on its own (if the page has no text and no description in the open directory [dmoz] it is likely Google will use the meta description tag in its SERPs). Please note that it is only using this description in its SERPs, not its algorithm.

Should you use Meta Tags on your site? Yes. They do have some effect in some search engines, and even though that effect is almost zero, it is still more than zero, so it is worth the time.

How much time should I spend on my Meta Tags? Ten minutes. Write a nice concise description of your page and throw in a sampling of keywords (which you should have handy if you’ve optimized your pages properly). You should spend no more time than this on them. Use your time to promote your site and get quality inbound links.

How many keywords should I use? As many as you want. If you start to think you may have too many, you probably do. This means you need to divide your page into subpages with each one taking its own topic.

What happens if I use includes for my pages? Will the search engines see them?

The search engines don’t care about what server side technology you use. All they see is the (x)HTML your server side code generates. To see what they see simply load your page in your favorite web browser and then view the source. What you see is exactly what they see.

Should I submit my website to the Search Engines by hand or use software?

Do it by hand. It will not take long to do and will ensure that you are successful in submitting each form with the correct information. There is a constant debate about how search engines feel about automated submission software. Since there is a reasonable chance these are frowned upon by the search engines, and since you can do anything they can do on your own, you might as well avoid them.

But, if you’re going to use software, these titles seem to be the most common ones recommended: Addweb, Web Position Gold (http://www.webposition.com/), and Web CEO (http://www.webceo.com/).

How often should I submit my website to the search engines?

Once. Resubmitting your url does not get you indexed faster or improve your rankings. Also, resubmitting your site will never cause your site to be banned. If so, then all you would need to do is submit your competitors’ sites repeatedly until they were banned.

Does a page’s file extension (.html, .php, etc.) affect its rankings?

This is a very common myth that is 100% untrue. The file extension does not affect your rankings in any way. After all, no matter what server-side programming language you use, and what extension you choose to use with it, they all just spit out HTML in the end. That’s all a web browser will see, and that’s all a search engine will see.

Sites with .com rank higher than with <TLD here>

This is another common myth that is untrue. The only time a domain extension can affect your ranking is if the search is based by country. The country-specific TLDs (e.g. .co.uk) will have priority over non-country specific TLDs (e.g. .com or .net).

One observation many make is that .coms tend to rank higher than other domain extensions. They assume it is because .coms are given preferential treatment. This is a poor assumption. .coms seem to rank higher than other extensions because they are by far more popular than any other domain extension (there are more .coms than .net, .org, .biz, .edu, .gov, and .info combined), so through sheer quantity alone they naturally have a greater chance of ranking higher than other extensions. .coms also tend to be older sites, so they have had a chance to establish themselves, whereas newer domain extensions have not. They have also used this time to acquire more backlinks, which is an important factor in search engine algorithms.

It is also commonly believed that .gov and .edu sites are given preferential treatment from search engines. This is also untrue. Web pages on .edu and .gov domains tend to rank well because they contain quality content and many webmasters will link to their content as a result. Both of these are key elements in SEO. But the fact that they are .edu or .gov domains does not benefit them directly in the SERPs.

Pages with query strings don’t rank as well as without query strings

Another common myth that is untrue. The only way variables in a query string can affect a site in the SERPs is if it has a sessionID or something that looks like a sessionID in it (e.g. id=123456). These usually prevent indexing of those pages or limit the number of pages indexed. But query strings do not affect a page’s ranking, either positively or negatively.
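As an illustration only, here is a rough sketch of how one might flag session-ID-like query strings. The parameter names and the “long opaque value” rule here are assumptions for this example, not how any engine actually works:

```python
import re
from urllib.parse import urlparse, parse_qs

# Heuristic only: these parameter names are assumptions for illustration.
SESSION_PARAM_NAMES = {"sessionid", "sid", "phpsessid", "jsessionid", "id"}

def looks_like_session_url(url):
    """Flag URLs whose query string carries a session-ID-like parameter."""
    query = parse_qs(urlparse(url).query)
    for name, values in query.items():
        if name.lower() in SESSION_PARAM_NAMES:
            # e.g. id=123456 or sid=a1b2c3d4 (a long opaque token)
            if any(re.fullmatch(r"[A-Za-z0-9]{6,}", v) for v in values):
                return True
    return False

print(looks_like_session_url("http://example.com/page?id=123456"))       # True
print(looks_like_session_url("http://example.com/page?category=shoes"))  # False
```

The point is that a descriptive parameter like category=shoes is harmless, while an opaque token like id=123456 is the kind of thing that limits indexing.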

Should I use relative links or absolute links?

Absolute links. It is recommended by Google as it is possible for crawlers to miss some relative links.

I just changed from .html to .php. How can I switch without losing my rankings?

There are two ways to do this:

  1. Tell Apache to parse all .html files as PHP files. Using this method you do not have to change any file extensions or worry about any redirects. To do this, place this code in your httpd.conf file:

AddType application/x-httpd-php .php .html

  2. Use a 301 redirect to redirect from the .html files to the .php files. You can do that by placing this code in the .htaccess file in the root directory of your website:

RedirectMatch 301 ^/(.*)\.html$ http://www.yourdomain.com/$1.php
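You can sanity-check a pattern like this before deploying it. This Python sketch applies the same regular expression the RedirectMatch rule uses (www.yourdomain.com is a placeholder):

```python
import re

def redirect_target(path):
    """Map an old .html path to its .php replacement, or None if no match."""
    # Same pattern as the RedirectMatch rule; the domain is a placeholder.
    match = re.fullmatch(r"/(.*)\.html", path)
    if match is None:
        return None
    return "http://www.yourdomain.com/%s.php" % match.group(1)

print(redirect_target("/articles/seo-faq.html"))
# http://www.yourdomain.com/articles/seo-faq.php
print(redirect_target("/logo.png"))  # None -- non-.html URLs are untouched
```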

I just changed my domain name. How can I switch without losing my rankings?

You’ll need to do a 301 redirect from the old domain to the new domain. Fortunately this is not difficult to do. You’ll need to add the following lines of code to a file called .htaccess and place it in the root directory of the old domain:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-domain.com/$1 [R=301,L]

Why aren’t all my pages being indexed?

If your site is less than six months old, stop reading now. Your site is too new to be worrying about getting all of your pages indexed. Be patient. It takes time to crawl through your whole website and add your pages to the index. If you are sure your pages are search engine friendly then you have nothing to worry about.

If your site is six months old or older you need to check your website to make sure all of your pages can be found and indexed. Have you:

  1. Made a human sitemap?
  2. Made a Google or Yahoo sitemap?
  3. Used search engine friendly URLs?
  4. Used search engine friendly navigation?

An additional note: get incoming links. These are important for the search engines’ algorithms and may play an important part in how deep the search engines will crawl your website.

How do I check if my site is search engine friendly?

Turn off JavaScript, CSS, and cookies in your web browser and view your website. This is how the search engines most likely see your website. If you can successfully view your content and navigate your website, your site is mostly search engine friendly. The only other thing to check is your URLs: avoiding session IDs and ‘id=’ in your query strings is also very helpful.
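If you want to approximate that “JavaScript and CSS off” view programmatically, here is a rough sketch using Python’s standard html.parser that keeps visible text and drops <script>/<style> content. Real crawlers are more sophisticated, so treat this as an illustration only:

```python
from html.parser import HTMLParser

class TextOnlyView(HTMLParser):
    """Very rough 'crawler view': keeps text, drops <script>/<style> content."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self._skipping = 0
        self.text = []

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skipping += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skipping:
            self._skipping -= 1

    def handle_data(self, data):
        if not self._skipping and data.strip():
            self.text.append(data.strip())

page = """<html><head><style>h1{color:red}</style></head>
<body><h1>SEO FAQ</h1><script>document.write('hidden');</script>
<p>Visible content.</p></body></html>"""

viewer = TextOnlyView()
viewer.feed(page)
print(viewer.text)  # ['SEO FAQ', 'Visible content.']
```

Anything that only appears in the dropped script content is invisible to this kind of reader, which is the situation your content is in if you rely on JavaScript to render it.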

What does it mean to have your site indexed by the search engines?

To be indexed by the search engines means your webpages have been crawled and included in the database of the search engines. Your pages are now available to be included in search results of user queries. This doesn’t mean your pages are guaranteed to be included. It just means they are available. The pages will still need to be relevant to the search terms before they will be included in the SERPs.

Which is better for domain name and/or url: hyphen (-), underscore(_), or plus sign(+)?

Hyphens and underscores are the best keyword delimiters you can use in your domain name or URL. They are seen as equal by all of the major search engines.

Many say that separators are not necessary as search engines can find keywords in URLs without assistance. They are smart and most likely can pick some keywords out of a URL. But they are not that smart. Sometimes it is not obvious where one keyword ends and another begins. For example: expertsexchange.com can be seen as “experts exchange” and “expert sex change”. These are obviously two very different topics. In this case a hyphen or underscore would clearly separate the keywords and solve this problem.
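The point about delimiters can be sketched in a couple of lines: splitting on a hyphen, underscore, or plus sign is trivial, while an unseparated slug stays ambiguous:

```python
import re

def url_keywords(slug):
    """Split a URL slug into keywords on hyphens, underscores, and plus signs."""
    return [word for word in re.split(r"[-_+]", slug) if word]

print(url_keywords("experts-exchange"))  # ['experts', 'exchange']
print(url_keywords("expert_sex_change")) # ['expert', 'sex', 'change']
print(url_keywords("expertsexchange"))   # ['expertsexchange'] -- ambiguity remains
```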

Will too many hyphens in your domain name cause the search engines to label your site as spam?

No. This is a myth caused by many spam sites using multiple hyphens in their domain names. Many people have wrongly concluded that only spam sites would need to use more than one hyphen. The truth of the matter is that having more than one hyphen in your domain name will not result in your site being penalized. The more likely scenario is that having multiple hyphens will result in a flag being set at the search engines and a manual review being done to see if the site is spammy or legitimate.

One thing to keep in mind when choosing a domain name with hyphens in it: your users. When using a domain with multiple hyphens you make it more difficult for your human visitors to remember and type in your domain name. Domain names with more than one hyphen should only be used if you are attempting to market your website through the search engines. If you plan on doing offline advertising, including word of mouth, one hyphen or less is recommended.

Does the order of the keywords used in a search affect the search results?

Yes. Do a search and see for yourself.

Does the order of the keywords in a domain name/URL matter?

Yes. You will typically rank better in the SERPs for the phrases that use the words in the same order as your domain and URL than for phrases where they are not in the same order.

Does using automated SEO software cause a website to be penalized?

No. This is a common myth that is untrue. If it were true you could get your competitor penalized or banned by using automated SEO software to resubmit their website every 60 seconds. Naturally this does not happen (nor should it).

Some webmasters will try to say that Google says in their guidelines that you shouldn’t use automated software like Web Position Gold. The reason for this is that most of these tools scrape Google’s SERPs to find your site’s ranking information. This is in violation of Google’s terms of service. Software that uses Google’s API is acceptable for querying their servers. Also, if you constantly use SEO software to query a search engine’s servers you might find that they ban your IP address to prevent you from using their resources any further. However, this has no effect on your web pages’ rankings.

Can search engines see password protected pages?

Search engines are not different from regular users in most ways. They cannot go anywhere that a regular user cannot go. If you have a password protected area of your website that cannot be accessed without a login and password then the search engines cannot see it.

Which is better for SEO: text links or graphical links?

Text links are better for SEO. Text links can contain the anchor text that your page wishes to rank well for, and that is an important factor in all three major search engines, especially Google. Image links are still valuable but less so than text links. This is true despite image tags having the ALT attribute available. The ALT attribute can contain keywords, but thanks to keyword stuffing it is now virtually worthless. (You should be using your ALT attributes for usability and accessibility and not SEO anyway.)

Does validation help your ranking?

Short answer: No.

Longer answer: No. But having a webpage that validates is a good idea. A webpage that has been validated to a W3C standard contains no errors and therefore can be easily parsed and understood by the search engine crawlers. An invalid webpage runs the risk of being misinterpreted or just not read at all.

Can the search engines read javascript?

Probably not. But we can’t say no for sure because some JavaScript is so easy to read it is hard to imagine that it does not get interpreted. An example of an easy to interpret snippet of JavaScript would be:

<script type="text/javascript">
 document.write('<a href="http://www.example.com">Trying to hide this link from search engines</a>');
</script>
To ensure that the search engines don’t read your JavaScript it should be inserted into a web page using an external file and that directory should be blocked using robots.txt.
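For example, if your scripts live in a directory such as /js/ (the directory name here is just an assumption for illustration), the robots.txt rule blocking it might look like:

```
User-agent: *
Disallow: /js/
```

With the JavaScript in an external file inside that directory, compliant crawlers will not fetch it at all.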

What is a quality link?

A quality link is:

  1. On topic (The page linking to your page is about the same main topic)
  2. Ranked well for the keyphrase you are after (In the top 1,000)
  3. Contains the keywords you wish to rank well for
  4. Has high PR (PR 4 or higher)

I left out high traffic because that is irrelevant from an SEO point of view. But if you’re looking at the big picture that would be #5.

Why should I not launch an incomplete website?

  1. Users will remember that your site was incomplete and will be less willing to come back

  2. Search engines may index incomplete pages and cache them and then not refresh their cache for months or years

  3. Other webmasters will not exchange links with incomplete sites

  4. Directories won’t accept submissions from incomplete sites

Keep in mind this generally covers your “under construction” kind of incomplete sites. You certainly can launch a site and then continually add to it and grow it. Even adding whole new sections. But a site that is obviously incomplete just shouldn’t be set loose in the wild until it is ready to go.

What is referrer spam?

Referrer spam is when a spammer sends fake referrers to your server. They do this because they know most web stats packages list referrers as hyperlinks. They then submit your stats pages to the search engines in the hopes that the engines will crawl your stats and find that link. They also hope you click on the link yourself.

What is a doorway page?

From Wikipedia

Doorway pages are web pages that are created to rank high in search engine results for particular phrases with the purpose of sending you to a different page. They are also known as landing pages, bridge pages, portal pages, zebra pages, jump pages, gateway pages, entry pages and by other names.

What is cloaking?

From Wikipedia

A search engine optimization technique in which the content presented to the search engine spider is different from that presented to the user’s browser; this is done by delivering content based on the IP address or the User-Agent HTTP header of whatever is requesting the page. The only legitimate uses for cloaking used to be for delivering content to users that search engines couldn’t parse, like Macromedia Flash. However, cloaking is often used to try to trick search engines into giving the relevant site a higher ranking; it can also be used to trick search engine users into visiting a site based on the search engine description, when the site turns out to have substantially different - or even pornographic - content. For this reason some search engines threaten to ban sites using cloaking.

Hidden Text/Hidden DIVs

Hidden text/DIVs are only bad if you are using them to manipulate the SERPs. There are many practical uses of hidden text/DIVs that enhance a web page without being malicious.

Good uses of hidden text/DIVs: Dynamic menus, dynamic page content

Bad uses of hidden text/DIVs: Text that is present on the page but cannot be viewed by human beings at any time

Does changing hosting affect my ranking?

No. Your webhosting does not affect your rankings. You can change hosts without it affecting your rankings. The only issue you may run into is if you fail to make a smooth transition to your new webhost. Downtime will naturally prevent the search engine crawlers from crawling your site properly. Extended downtime may cause indexing issues.

To switch hosts properly follow these easy steps:

  1. Set up your website on your new webhost

  2. Change your DNS to point your domain name to your new webhost

  3. Leave the website up on the old server for at least one week to make sure DNS has propagated completely. After one week you can safely take down the site from the old server.

The most common mistake users make when switching hosts is not leaving the old site up while DNS propagates. Make sure you don’t wait until the last minute when switching hosts or you may run into trouble.

What is a “good crawlable design”?

  • Don’t use flash - flash is SEO suicide. Some say the search engines can crawl flash but even if they can they certainly can’t crawl it as well as HTML. (See below).
  • Don’t use JavaScript to create page content - For the most part, search engines don’t read JavaScript. If you use JavaScript to create your pages’ content it is as good as not being there when the search engines come around.
  • Interlink your pages - search engines find your pages by following other links in your pages. Be sure to link to your pages liberally especially important pages.
  • Use search engine friendly URLs - Although search engines can crawl query strings just fine, using a URL that appears static is a good thing. Errors can occur on long or complex query strings and this eliminates that possibility, plus it is a great chance to get keywords into your URL.
  • Use semantic markup - HTML is a powerful tool that search engines use to determine the context of a page (that’s another reason why flash sucks for SEO: no html). Use HTML properly to give keywords more weight within your pages. See the Search Engine Optimization FAQ for more on that.
  • Use a sitemap - sitemaps make sure your pages are easily found by the search engines (good for humans, too).

Flash and SEO

An all Flash website is handicapped versus a semantic website (HTML). Even optimizing non content aspects of your pages will still put an all Flash website at a severe disadvantage.

The problems with using Flash include:

  1. It’s a one page site. How many one page sites do you know that rank well?

  2. You lose the power of semantic markup. No HTML = no clues for the search engines as to the importance of keywords.

  3. Expanding on point 2, you don’t have any anchor text since you don’t have any internal links. That just kills you in Google.

  4. There isn’t a whole lot of proof that the search engines can read flash as well as HTML.

You have only one available tool for trying to SEO the site and its effect is minimal. Put alternative content between the <object> tags. This has the same effect as the <noscript> tags for JavaScript.

If you are making an all flash site, your only real hope is to try to be successful in a massive incoming link campaign. Otherwise you have to target marginally competitive keywords or niche keywords as you virtually don’t have a prayer of ranking for anything even remotely competitive.

Your only other option is to create a second version of the site so it can be read by search engines, users with accessibility issues, and users who don’t have flash. Of course you’ve doubled your development costs by doing this as you have two websites to maintain now.

Is Ajax bad for SEO?

Any content available via Ajax should also be available without Ajax if a site is designed properly (i.e. accessible). That means search engines should still be able to access that data even though they don’t support Ajax/JavaScript. If you cannot then it isn’t a flaw in using Ajax, it is a flaw in the development of the site.


Do I need a robots.txt file?

Contrary to common belief, a robots.txt file does not directly affect SEO. Robots.txt files are used to tell search engines (or more specifically, user-agents) what content they are not permitted to access (and thus index). This is usually done to prevent sensitive data from being found and indexed (admin control panels, etc.). If you do not wish to block any files from the search engines then you do not need a robots.txt file. Having one will not improve your rankings by itself nor make your site more attractive to the search engines. In fact, you should only use one if you absolutely need it, as an error in your robots.txt file may result in important pages not being crawled and indexed, and you will never know unless you check your file for errors at some point in the future.

If you do want to use a robots.txt file to prevent 404 errors in your logs make this the only content in your file:

User-agent: *
Disallow:

More on robots.txt including how to use one: http://www.robotstxt.org/wc/faq.html

Do outbound links help my rankings?

No. This is a common myth that is untrue.

The outgoing link theory has been pushed by amateurs for 4 or 5 years now; no one has ever proved it, and many have shown evidence of it not mattering. Newer people to SEO tend to see Google’s goal as policing the webmaster community and making sure everyone plays fair. Google is not a referee; it is a search engine, and it cares about serving relevant results.

Newer people to SEO also fail to understand what PageRank is. PageRank is a measure of perceived page quality. If an outgoing link adds to a page’s quality, that page will get more incoming links and thus rank better. If it doesn’t add to the page’s quality, no bonus will be had. There is absolutely no reason for Google to second-guess itself and add arbitrary blanket bonuses or penalties to all sites based on the notion that a certain attribute always makes a site better or worse. So, in short, because Google measures incoming links, it has no need to measure outgoing links, or anything else that supposedly marks a site as having a higher “quality.” In the end, if a site truly does have higher quality, it will get more incoming links naturally.

Then there is the fact that outgoing links are strictly under the control of the webmaster, like meta tags, and so assigning them any weight leads to the same problems that brought about the downfall of meta tags.

Finally, there are all the thousands or millions of sites and pages that rank perfectly well without any outgoing links. Certain types of sites, such as blogs, normally have outgoing links and it would look abnormal for them not to. However most other site types normally do not have outgoing links and haven’t traditionally had them, going back to the 90s, long before Google came about. Most business, commercial, ecommerce, or service sites do not have outgoing links. Not because they’re hoarding PR, but because they’re trying to sell something and do not want to distract from the user experience or send users away.

You may not remember a time before incoming-link algorithms. In those times, to measure quality, search engines had to guess based on on-page factors, and it was hard to impossible. With the invention of incoming-link analysis (PageRank), search engines had a much better way to measure the quality of a site, and so then only had to discern topicality. Why would they take a step backwards and again start using on-page factors to measure quality?

Google has a lot of smart people working for them. They realize that if external links truly do add to the usefulness of a site, then that site is already receiving a bonus, because more useful sites garner more incoming links. This is also true for anything else that supposedly adds usefulness. They aren’t going to say, “Hey, we have this really good algorithm here, but let’s second-guess it and assume that pages without outgoing links need to be penalized for being less useful.” Why would you ever make an assumption about something that you can already measure directly?

Also, do not forget, Google itself created the nofollow link attribute to give webmasters an easier way to block links.

In the end, if Google did give value to external links, it’d be meaningless. As soon as it was confirmed (which no one has been able to do) all the spammers and everyone else would just add one or two links to their pages. It would do nothing to increase relevance.

Does a page’s traffic affect its rank?

No and here’s why:

  1. The search engines don’t have access to the data they would need to use this as a ranking factor. They do not know how much traffic a web page gets, as it is not publicly available, and thus cannot use it to determine its rank. (For those of you who want to say, “But there is Google Analytics”, that service is used only by a small percentage of websites and unless every web site decided to use it on every web page the data is far too incomplete to be used this way).

  2. It would be a self-fulfilling prophecy if the search engines used their own SERPs as a means of determining their search results. Obviously the number one ranked page for a search is going to get more traffic than a page not on the first page. If traffic were the indicator of where a page belonged, there would be little or no way for a page to ever move up, simply because the pages ranked higher would be receiving more traffic from the search engine by virtue of being ranked higher.

  3. Traffic volume can be manipulated. Spammers and black hats could easily write bots to artificially inflate their page views and thus their rankings. Plus you can purchase traffic from traffic providers or buy expired domains and redirect them to your site. It would just be too easy to do. (I can also see newbies hitting refresh for hours on end…)

  4. Traffic is not an indicator of quality content. It is only an indicator of good marketing.

What about reciprocal links?

In general, reciprocal links are bad for SEO and should be avoided. Here’s why:

  1. They are a clear attempt to manipulate the search results which is a big no-no. That’s why Google specifically outs them in their webmaster guidelines. Basically they see it as vote swapping. If you have an excessive amount of reciprocal links you run the risk of incurring penalties. (No one knows how many it takes to incur a penalty so it isn’t wise to push your luck).

  2. You risk being considered part of a link farm. If you link to a website that is considered a link farm and they link back to you, you may be seen as being part of the link farm. Link farms violate the search engine’s TOS and are a quick way to get banned.

  3. The links themselves carry virtually no value, or worse, cause you to lose strength from your pages. Because the links are on unrelated pages, or pages that have little value for your niche (e.g. the wrong context), they hold little value in the search engines’ eyes. What little value they may have gets lost when you send a link back to the other website, negating any value that link may have had. Even worse, if your link is “worth” more than their link, you will actually be hurting your site with that link exchange.

  4. Many webmasters are dishonest and will remove your link or hide it from the search engines. No incoming link means no gain for you.

Link exchanges should be saved for websites in your niche that are well established and ahead of you in the rankings. A great suggestion in the SEO Guide is to do a content link exchange with related sites (see [URL=“http://www.websitepublisher.net/article/link-building/4”]Buying Links & Link Exchanges).
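Point 4 above is at least easy to guard against: a short script can periodically confirm that a partner’s page still carries a link back to you. Here is a minimal sketch in Python; the partner page and domain are hypothetical, and a simple regex check like this won’t catch links that are hidden from the search engines via JavaScript or nofollow:

```python
import re

def links_to(page_html, your_domain):
    """Return True if the page's HTML contains an href pointing at your domain."""
    hrefs = re.findall(r'href\s*=\s*["\']([^"\']+)', page_html, re.IGNORECASE)
    return any(your_domain in href for href in hrefs)

# Hypothetical partner page that still carries the link:
partner_page = '<p>Our friends: <a href="http://www.example.com/">Example</a></p>'
print(links_to(partner_page, "example.com"))  # True
```

In practice you would fetch each partner page on a schedule and flag any page where this returns False.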

What keyword tools should I use and how do I use them?

Good tools to use for keywords research are Google’s Adwords Keyword Suggestion Tool and [URL=“http://www.google.com/trends”]Google Trends. (Find more Keyword research tools)

You have to keep in mind that the numbers from tools like Google’s Adwords Keyword Suggestion Tool and Wordtracker are not to be taken literally. You are supposed to compare the search volumes of keywords relative to each other and to the major search terms. That gives you an idea of how frequently a search term is being used. The exact number isn’t important unless you’re conducting trend analysis over an extended period of time, and even then it doesn’t really offer any useful information; a number within 5% - 10% (or maybe more) of the exact figure is just as useful. Those rough numbers will clearly expose which terms are popular and which are not. If Google’s Adwords Keyword Suggestion Tool or Wordtracker shows no results for a keyword, its search volume is extremely low, which is all you need to know. Whether it is 1 search or 100 searches doesn’t matter: you now know what kind of volume it has and what to expect in terms of competitiveness and traffic.

For example:

Let’s use these fictitious results for ‘stymieebot’

stymieebot 15000
stymieebot clone 6000
stymieebot repellent 5500
stymieebot stickers 5200
stymieebot t-shirts 4950
stymieebot hoolahoop 300
stymieebot mask 180
stymieebot uzi 15
stymieebot cologne 1

What we can tell is ‘stymieebot’ is clearly the most popular search term related to ‘stymieebot’. The number of searches could be 18,000 or 12,000 and it still would clearly be the primary search term we would hope to rank well for and the most competitive (most likely).

‘stymieebot clone’, ‘stymieebot repellent’, ‘stymieebot stickers’, and ‘stymieebot t-shirts’ make up the second tier of results. They’re grouped relatively close together and their order really is irrelevant. Their order will almost certainly change month-to-month but their average search volume will most likely remain the same. They’ll always be searched far less than just ‘stymieebot’ but still get a decent number of searches each month. Their numbers don’t matter because we know how popular they are relative to ‘stymieebot’ and that they are searched often enough to be worth targeting.

‘stymieebot hoolahoop’, ‘stymieebot mask’, ‘stymieebot uzi’, and ‘stymieebot cologne’ make up the third tier of results. They’re seldom searched for and will either be longtail keywords or ignored completely. The exact number of searches is irrelevant because, relative to the first two tiers, we can see traffic from these terms will be sporadic at best, and we can assume they will be easy to target.
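To make the tiering logic concrete, here is a rough Python sketch using the fictitious ‘stymieebot’ numbers above. The 80% and 20% cut-offs are arbitrary assumptions chosen for illustration, not thresholds any real keyword tool uses; the point is only that relative volume, not the exact count, determines the tier:

```python
# Fictitious search volumes from the example above.
volumes = {
    "stymieebot": 15000,
    "stymieebot clone": 6000,
    "stymieebot repellent": 5500,
    "stymieebot stickers": 5200,
    "stymieebot t-shirts": 4950,
    "stymieebot hoolahoop": 300,
    "stymieebot mask": 180,
    "stymieebot uzi": 15,
    "stymieebot cologne": 1,
}

def tier_keywords(volumes):
    """Group keywords into tiers by volume relative to the top term."""
    top = max(volumes.values())
    tiers = {1: [], 2: [], 3: []}
    for term, vol in volumes.items():
        ratio = vol / top
        if ratio >= 0.8:
            tiers[1].append(term)   # primary term(s), most competitive
        elif ratio >= 0.2:
            tiers[2].append(term)   # worth targeting directly
        else:
            tiers[3].append(term)   # longtail or ignore
    return tiers

print(tier_keywords(volumes)[1])  # ['stymieebot']
```

Note that the bucketing comes out the same whether ‘stymieebot’ has 12,000 or 18,000 searches, which is exactly the point being made above.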

How do I improve my rankings for country-specific search?

To rank better in country-specific search you should:

  1. Use the country specific TLD

  2. Host the site in that country

  3. Set the geographic location for the site in Google Webmaster Tools

What directories should I submit my site to?
There are four tiers of directories:

  1. Truly quality directories - these directories are well known, actually used by some people (although not really a whole lot), and their links have decent value (just decent value, not great value). You can count the number of these directories on two hands and probably have fingers left over. These directories include Dmoz and Yahoo. Links from these sites are the most valuable types of links you can get from a directory. However, even then they are not that strong. Links from related websites are much better.

  2. Quality niche directories - These directories exclusively list sites in a certain niche: yours. These directories don’t carry the weight of the first tier but because they are in a related niche they are better than general directories.

  3. General directories with an editorial process - These are your run-of-the-mill, just-like-everyone-else directories that litter the Internet. What separates these from the bottom tier of directories is that these directories actually monitor their listings and try to list only quality sites, rejecting spam sites and Internet waste. Links from these directories are not worth very much. Basically, if you are seeking links from these kinds of directories you are going for volume as opposed to quality. Over time these can help your rankings for longtail and medium-competitiveness keywords.

  4. General directories with no editorial process - These directories accept anyone. They are full of crap sites and probably engage in a lot of link exchanges. These directories are worthless and should be avoided.

Common SEO Myths (These are UNTRUE)

  • Outbound links improve your ranking
  • Submitting your site to the search engines too many times will get you banned
  • Links from .edu and .gov sites are worth more than links from other TLDs
  • Pages with .php extensions don’t rank as well as pages with .html extensions
  • Search engines won’t index pages with query strings / query strings are never search engine friendly
  • Using SEO software will get your site penalized
  • Too many hyphens in your domain name will cause the search engines to label your site as spam
  • Traffic is a factor in a page’s ranking
  • Being on a shared IP address will hurt your rankings


Keyword Stuffing - This occurs when a web page is loaded with keywords in the meta tags or in content.

Link Baiting - Writing good content that people will naturally want to link to.

Link Building - Getting more links to point to your webpages. Also known as “acquiring inbound links”.

Link Farm - Any group of web pages that all hyperlink to every other page in the group. Done with the intention of manipulating the SERPs and is usually penalized by the search engines.

Organic Search - The search results determined by a search engine’s algorithm. The opposite of paid inclusion advertising.

Spider - Also known as a web spider, web robot, or bot; a program which browses the World Wide Web in a methodical, automated manner.

What is the story with Alexa?

Alexa’s rankings are generally considered to be inaccurate at best. Their rankings depend on a user having their toolbar or their spyware installed in order to track their surfing habits. Plus their software is limited to the Windows operating system, further limiting the reach of their software and the accuracy of their results.

With the possible exceptions of selling/buying a website and applying to an ad service, Alexa serves no useful purpose, and important decisions should not be made based on its results.

If you want to improve your ranking in Alexa just install the toolbar into your browser. Be sure to visit your site daily. This will cause your site to jump in the rankings after a few weeks. Get your friends to do it, too, and you can make a significant impact on your rankings.

What would be a good SEO strategy?

Before you write one line of code:

  • Do keyword research to determine what keywords you want to target.

While constructing your website you should do the following:

  • Use markup to indicate the content of your site
  • Optimize your <title> tags on each page to contain 1 - 3 keywords
  • Create unique Meta Tags for each page
  • Use header tags appropriately (H1 > H2 > H3)
  • Use <b> and <i> tags if appropriate
  • Optimize your URLs
      • Use Search Engine Friendly URLs
      • Use keywords in your domain (http://www.keyword1.com/)
      • Use keywords in your URL (http://www.example.com/keyword2/keyword3.html)
      • Use dashes or underscores to separate words in your URLs (keyword2-keyword3.html)
  • Optimize your content
      • Use keywords liberally yet appropriately throughout each page
      • Have unique content
      • Have quality content
  • Use search engine friendly design
      • Create a human sitemap
      • Do not use inaccessible site navigation (JavaScript or Flash menus)
      • Minimize outbound links
      • Keep your pages under 100K in size
      • Keep the number of links on a page to less than 100
      • Design the navigational structure of the site to channel PR to main pages (especially the homepage)
  • Create a page that encourages webmasters to link to your site
      • Provide the relevant HTML to create their link to you (make sure the anchor text contains keywords)
      • Provide any images you may want them to use (although text links are better)
  • Make sure your website is complete before launching it
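Several of the mechanical items on this checklist (a non-empty <title>, fewer than 100 links, under 100K of HTML) are easy to verify automatically before launch. Here is a minimal sketch in Python using only the standard library; the thresholds come straight from the list above, and everything else (the sample page, the report keys) is illustrative:

```python
from html.parser import HTMLParser

class PageAudit(HTMLParser):
    """Collects a few of the on-page signals from the checklist above."""
    def __init__(self):
        super().__init__()
        self.link_count = 0
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.link_count += 1
        elif tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def audit(html):
    """Check a page against a few of the checklist items."""
    parser = PageAudit()
    parser.feed(html)
    return {
        "has_title": bool(parser.title.strip()),
        "under_100_links": parser.link_count < 100,
        "under_100k": len(html.encode("utf-8")) < 100 * 1024,
    }

page = '<html><head><title>Acme Widgets</title></head><body><a href="/about">About</a></body></html>'
print(audit(page))  # {'has_title': True, 'under_100_links': True, 'under_100k': True}
```

A script like this obviously can’t judge content quality or keyword usage, but it catches the purely mechanical mistakes cheaply.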

Immediately after launching your site you should do the following:

If you will pay to promote your website:

  • Submit your site to paid directories
      • Yahoo
      • GoGuides

Finally, as part of an ongoing strategy:

  • Continually update your website with quality, unique content
  • Continually seek free links, preferably from sites in your genre

Do NOT do the following:

  • Make an all-Flash website (without an HTML alternative)
  • Use JavaScript or Flash for navigation
  • Spam other websites for incoming links
  • Launch your site before it is done
  • Use duplicate content
      • Point several domains to one site without using a 301 redirect
      • Make a site out of content duplicated from other websites
  • Use markup inappropriately
      • Style <H>eader tags to look like regular text
      • Hide content using ‘display: none’ (for the sake of hiding text)
  • Use other “black hat” techniques (unless you accept the risk of being banned from the SERPs)
      • Doorway/Landing pages
      • Cloaking
      • Hidden text

Additional Tips:

  • Usable and accessible sites tend to be search engine friendly by their very nature
  • Be patient! High rankings don’t happen overnight
  • Don’t obsess over any one search engine. They are all worth your attention.

Good SitePoint Articles that you might find useful:

[article=search-engine-friendly-urls]Search Engine Friendly URLs[/article]
[article=top-10-google-myths-revealed]Top Ten Google Myths Revealed[/article]
[article=indexing-limits-where-bots-stop]Search Engine Indexing Limits: Where Do the Bots Stop?[/article]

Google based tools

Google Webmaster Tools

Google Keyword Tool
Provided free by Google AdWords. Shows basic search volumes and related terms.

Google Suggest
As you type, Google will offer suggestions. Good for related keyword research.

Google Trends
Provides useful insights into broad search patterns across the world.

Google Zeitgeist
Weekly Google Search patterns and trends.

SEO Book Google Suggest Scraper Tool
Scrapes Keyword Suggestions from Google Suggest.

Yahoo! based tools

Yahoo! Site Explorer

Overture/Yahoo! Keyword Suggestion Tool
Official Overture Keyword Selector Tool.

Yahoo! Buzz
Statistics of Top Searched Terms on Yahoo! by Category.

Overture SEO Book Keyword Suggestion Tool
Scrapes the Overture Suggestion Tool but includes much more useful information. You can also target by country.

DigitalPoint Keyword Suggestion Tool
Uses the Overture Suggestion Tool and Wordtracker and compares the two results.

MSN based tools

MS AdLabs Search Funnels
You can use the adCenter search funnel tool to help you visualize how people search by entering related keywords in certain sequences, and to analyze these search behaviors.

Keyword research tools

Trellian Free Keyword Discovery
Another good, free keyword tool. Also offers advanced features on subscription.

Free wordtracker Keyword Suggestion Tool
Generates up to 100 free, related keywords and an estimate of their daily search volume.

Keyword Suggestions by CheckRankings.com
Shows the number of searches, competitors, and competing AdWords in Google. Also provides a free ranking monitoring tool.

Lycos Top 50
Top 50 keyword list from Lycos.

Nichebot Classic
A 3 in 1 keyword suggestion tool: keyword discovery, overture and wordtracker.

Find exactly which competitors there are in your niche.

GoLexa Search Tool
The Search Tool with Complete Page Analysis for each Result and much more.

Keyword Lizzard
By Google AdWords Expert.

Ontology Finder
Related Keywords Lookup Tool by goRank.com.

Link research tools

Marketleap Link Popularity Checker
Backlink Anchor Text Checker

Other Tools

Webmaster Eyes


SEO Guide
SEO Glossary
Beginner’s Guide to Search Engine Optimization
Tips For New SEOs
Do It Yourself SEO - Part One
Do It Yourself SEO - Part Two

Very nice. Hopefully it will be stickied so it will serve as a good resource for everybody.

Another website you could add that checks PR is www.pagerank.net.

Good work! Should be a good resource for many people!

I think www.digitalpoint.com deserves a mention somewhere as a SERP and PR checker.

Well done!

Thank you stymiee for the useful links!

Added, thanks!

You’re very right. That does deserve mention.

When is Google going to do its next update?

I don’t know. He doesn’t know. She doesn’t know. Nobody knows. So please stop asking!

The best answer to that question I have ever heard… :slight_smile:

How do I get a lot of backlinks to point to my site?

Johnny, you could add linkingmatters.com as a resource for it. Excellent stuff.

Looks like a commercial site to me. Trying to keep the FAQ as “free” as possible.

Nah. The report and workbook are absolutely free and among the best pieces of work I have seen. In fact they were recommended thrice by Larry Chase (WDFM.com).

Updated: added links to some SEO software websites. Seems to be a common topic.

Updated: added http://www.isedb.com

Thread stickied :slight_smile:


How long did that thread take to write out? :eek:
It’s very useful, so I shake your hand in congrats :tup: shakes stymiee’s hand.

It only took about 30 minutes or so. Most of what’s there has been discussed so many times the answers were pretty much on the top of my head! :slight_smile: I spent most of my time just verifying a few facts and assembling the appropriate links.

Thank you for the link to the PR Extension for Firefox! That was my only reason for ever using IE to actually browse… to check the PR of a site. Now I can do it in a much less obtrusive manner in Firefox!! Just one MORE powerpoint for Firefox. The King of Browsers!

Nice job, keep it up. But stymiee, I read in an article that adding pages to search engines is not really necessary. They suggested exchanging links with already indexed sites. This enables the spidering software to index your website too, because your link is on a site that has already been indexed. It therefore indexes your site as well. How about that?

I would add another FAQ question myself.

My site has been live for <some number less than 6> weeks. Why haven’t any of the search engines picked it up?

Answer: It takes a good 6 to 8 weeks to get picked up by the search engines, and maybe longer if your site has say, thousands of pages.

If you choose to include this in the FAQ, feel free to change the wording as you like it. :slight_smile:

I would add this article:

It covers a lot of the points you make in greater detail.

Added along with your article about search engine friendly urls.

I think I have to mention Aspen’s site.
It’s my reference.