Best SEO strategy for old sites and new websites

Hello,

I have old sites (2+ years old) and new websites (1 year old).

What is a good SEO strategy to get these sites indexed?

What I learned from this forum is to write articles and submit them to social bookmarking sites.

How often do I need to submit them to social bookmarking sites for new sites/old sites?

Thanks a lot.

Have a nice day!

If that is all you have learnt, or is the best of what you have learnt here on SitePoint then we’re not doing a very good job :frowning:

Social Bookmarking is about engaging with your online audience, allowing them, independently of your actions, to spread the word about your site/content. Social Media Marketing is NOT an SEO technique - it is a marketing one.

If you want to encourage better indexation of your content then:

  • Get back-links from high-PR web-pages (note that I said web-pages, not websites - there’s a difference). PR doesn’t help ranking so much, but Google has stated that it affects the rate and depth of indexation.
  • Provide an XML site-map of your website content. Ensure that this site-map is addressed by your robots.txt file.
  • Provide an HTML version of your site-map to give crawlers a discoverable path to your content.
  • Deep-link to your content from your top-level pages.
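
To make the site-map point concrete, here’s a rough sketch (example.com is a placeholder domain). The robots.txt can point at the site-map with the standard Sitemap directive:

```text
# robots.txt
User-agent: *
Sitemap: https://www.example.com/sitemap.xml
```

and a minimal XML site-map, following the sitemaps.org protocol, looks something like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/widgets/red-widget</loc>
    <lastmod>2010-01-15</lastmod>
  </url>
</urlset>
```

Submit the site-map URL in Google Webmaster Tools as well, so you can see how much of it actually gets indexed.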

Deep-link to your content from your top-level pages.

Does that mean:

  1. The homepage should have a link to a 2nd page,
  2. the 2nd page should have a link to a 3rd page,
  3. the 3rd page should have a link to a 4th page,
  4. and so on.

Am I right?

There’s no set rule. And if there was one I wouldn’t follow it :wink:

Deep-linking normally refers to linking to your bottom-level pages (normally product/article pages) from your home-page or second-level page.

I’d probably use the same strategy on both of them. I am a big article marketing person so that’s what I mostly focus on as far as link building. But I also like to add other techniques (video marketing, social bookmarking, “link baiting,” blog commenting) to the pot so I’m not relying entirely on any one technique.

You should build your link structure based on what works best for your users. The rule of thumb I use is that no page should be more than 3 clicks away from any other page on the site.

Since a link normally tells a user something about the page it links to, it will do the same job for the search engine bot. You can get seriously tricky with internal link structures - some people charge a lot of money for it, but I think it’s money for old rope. You can’t funnel or ‘silo’ PR, links should be relevant to their destination anyway, and the on-page position of links will reflect their importance naturally… etc etc

Seriously, if you build your links as if search engines didn’t even exist, you’d probably be doing it exactly right.

I agree - deep-linking isn’t an SEO technique in and of itself. Think of occasions when sites provide links direct to popular products or articles from the home-page. This helps people browsing/searching, as well as search engines finding that content without wading through nested categories etc.

Well since the ‘mean old man’ is back I’m going to ignore the other answers and give it to you straight.

Here are the rules (and I have proven them 10,000 times):

  1. A clean, light, WELL CODED site (the lower the code-to-content ratio, the better)
  2. Proper use of title, heading and alt tags … PROPER use of title attributes on links
  3. Proper keyword usage in the words you link (anchor text)
  4. CLEAN, SEO-friendly URLs that contain the important keywords for the page
  5. An understanding that you do NOT ‘optimize a site’; you optimize each page on a site for its unique keywords / terms
  6. Fresh content on a regular basis
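
To illustrate points 2-4, here’s a rough sketch of a hypothetical product page (the URL, page names and filenames are made up for illustration):

```html
<!-- Clean, keyword-bearing URL: https://www.example.com/red-widgets/
     rather than something like /index.php?cat=3&id=42 -->
<title>Red Widgets | Example Store</title>
...
<!-- one h1 per page, targeting that page's unique keywords -->
<h1>Red Widgets</h1>
<img src="red-widget.jpg" alt="Red widget, front view">
<!-- descriptive anchor text plus a title attribute on the link -->
<a href="/blue-widgets/" title="Browse our blue widget range">Blue widgets</a>
```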

Once all of that is done and your site is PERFECT then you can do the following

  1. Links, links and more links
  2. More fresh content that is on topic for your site
  3. If applicable some social networking

and if you still have some time on your hands THEN maybe write an article or two

But until you have taken every hour of every day and made damn sure your site is perfect in SEO terms, you are wasting your time.

Hi.

Agreed with everything up till that bit. 95% of the index doesn’t validate, and I’ve seen some totally putrid pages rank highly for competitive phrases.

I think your list is a worthy goal, and we should be doing all of that even if search engines didn’t exist, if only for the sake of Accessibility and cross-browser compatibility. AND it’s what Google wants us to do, for exactly those reasons. It’s not entirely necessary though.

I didn’t say anything about validating; I said perfect SEO-wise. I personally will make sure everything is valid, but you are right, it’s not required for SEO.

Sorry, I’ll elaborate. I didn’t mean that pages have to W3C-validate; I just meant that since 95% of pages don’t validate, they’re probably not coded that well. Which IS relevant to what you were saying, but doesn’t seem to stop so many dog’s dinner pages from ranking well.

You’re probably well aware that the most important signals, after relevance, occur off-page. So I still don’t agree with that last sentence of yours: pages that are far from perfect in ‘SEO terms’ still rank well…