I have been running a few old sites (3+ years old) and new sites (less than 1 year old). You can’t apply the same SEO tips for old and new sites.
They don’t work the same. I mean, I can submit articles or new pages to social bookmarking sites 5 times a week. However, I can’t do the same with new sites.
If I submit my new sites to social bookmarking sites too often, I risk getting de-indexed. If you want to get indexed faster, then don’t try to get many backlinks for new sites.
In other words, I can’t build backlinks too quickly for new sites.
Why not? Everything else being equal, the older the site, the more trusted it is in the eyes of the search engines. That’s about the only time the age of a site makes any difference.
I think Google crawls your site whether you have too many links or just a few.
Google crawls by descending PR (backlink number and importance).
So if you have, say, 10,000 links, do you think Google won’t index you?
Fast backlinking in and of itself isn’t a search engine red flag - it’s the nature of that sudden accumulation: where the links come from, and whether they look bought, automated, or spammy.
Things go viral every day on the web, and I have seen pages accumulate several thousand backlinks overnight without any negative impact (in fact, the opposite).
It’s simple maths: if a low-quality page gets thousands of low-quality backlinks, then the result is likely to be low-quality.
I think you can try something new, but alongside it you need to continue your regular work, as SEO is a continuous, long-term process. For example, if your site got indexed through one press release, you would not repeat the tactic by making unique content for the same keyword on the same site.
Domain age is one of the criteria that Google takes into consideration when ranking a site… but I don’t think there is any difference as far as SEO is concerned between new and old sites.
Like people clicking on AdSense ads to kill their competitors.
Google is foolish and dishonest not to build a system that prevents crooks from abusing honest webmasters. Crooks serve Google’s interests: although it pretends to do everything to keep the system honest, it does not do the best things.
There are techniques known as SPC (Statistical Process Control) which are well established in industry (I’ve been a statistical quality engineer, so I know what I’m talking about) and which can be used to detect cheaters. So why don’t they use them, instead of arbitrarily penalizing the honest webmaster and rewarding the cheating competitor who tried to kill other AdSense accounts or blogs?
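To make the SPC idea concrete, here is a minimal sketch of a Shewhart-style individuals control chart applied to daily click-through rates. All the numbers, function names, and the 3-sigma threshold are hypothetical illustrations of the general technique, not anything Google is known to use:

```python
# Sketch of a Shewhart-style individuals control chart (SPC) on daily
# ad click-through rates (CTR, in %). Points outside mean +/- 3 sigma
# of a baseline period are flagged as "out of control" (suspicious).
# All data here is made up for illustration.

def control_limits(baseline, sigmas=3.0):
    """Return (lower, upper) control limits from a baseline sample."""
    n = len(baseline)
    mean = sum(baseline) / n
    # Sample variance (n - 1 denominator), then standard deviation.
    var = sum((x - mean) ** 2 for x in baseline) / (n - 1)
    std = var ** 0.5
    return mean - sigmas * std, mean + sigmas * std

def flag_out_of_control(series, baseline, sigmas=3.0):
    """Indices of points in `series` falling outside the control limits."""
    lo, hi = control_limits(baseline, sigmas)
    return [i for i, x in enumerate(series) if x < lo or x > hi]

# Baseline: a site's normal daily CTR over ten quiet days.
baseline = [1.9, 2.1, 2.0, 1.8, 2.2, 2.0, 1.9, 2.1, 2.0, 2.0]
# Recent days: index 3 shows a sudden spike, e.g. a competitor
# hammering the ads to trigger a penalty.
recent = [2.0, 1.9, 2.1, 6.5, 2.0]

print(flag_out_of_control(recent, baseline))  # -> [3], the spike day
```

The point of a chart like this is that normal day-to-day variation stays inside the limits, so an honest webmaster is not flagged, while a sudden burst of clicks far outside the historical band stands out immediately and can be investigated rather than punished blindly.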