Thread: A problem with duplicate content
Dec 14, 2007, 14:59 #1
Here's the situation.
Site A is a more general site; Site B is more specific but in the same niche. I own both sites.
Both sites contain technical information, but Site A is very detailed while Site B is less so.
I'm worried about duplicate content: even though Site B carries a shorter version of the information, the search engines might still see it as duplicate.
I can think of two options:
1. Use robots.txt on Site B to block the information pages, as sketched below. (Question: do I also have to use rel="nofollow" for backlinks from external sites to the technical information pages?)
2. Completely rewrite the information on Site B, which could be troublesome as there is a lot of it. (Question: to what extent should I rewrite the information so that it won't be seen as duplicate content?)
Which would you suggest? Are there other (better) options?
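For illustration, the robots.txt in option 1 would look something like this (the /technical/ path is only a placeholder for wherever Site B's information pages actually live):

# Site B robots.txt — keep crawlers out of the duplicated technical pages
User-agent: *
Disallow: /technical/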
Dec 14, 2007, 15:06 #2
The robots.txt should stop Google from listing it. I would make new content, but you can use robots.txt.
Dec 14, 2007, 15:25 #3
Yes, I think creating new content is the best option too. But in my case it's hard to do, with lots of material to rewrite, and it's technical reference material at that.
But if I do decide to rewrite the content on Site B, what tips can you give? How much do I need to change before it won't be seen as duplicate content?
Dec 17, 2007, 02:26 #4
I suggest changing the content; keeping the same text is not good. Better to read Site A and Site B, take the full idea from both, and then write the content yourself, so it comes out different. Copy and paste produces duplicate content, but content typed in your own words will be different.
Dec 17, 2007, 03:00 #5
What is the purpose of having two sites? You say that one is more general while the other is specialized, but why not combine the two?
At the moment you're spending time and effort maintaining two sites, while inbound links and any marketing effort are split between them.
If all the content is related, I'd personally look at combining the two sites into one resource where people can come for both the basic information and the specialist content, unless there's really a good reason for having two separate sites.
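If you did merge them, the usual mechanism would be permanent (301) redirects from every Site B URL to its counterpart on Site A, so existing inbound links keep their value. A minimal sketch, assuming Site B runs Apache and the two sites share a URL structure (both assumptions, and site-a.example is a placeholder):

# .htaccess on Site B — map every old URL onto the same path at Site A
Redirect 301 / http://www.site-a.example/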
Dec 17, 2007, 03:06 #6
A robots.txt file isn't going to stop search engines from indexing Site B's pages if someone links to them. It would be better to add a robots meta tag in the head of each page you want to block.
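For example, something like this in the head section of each page to be blocked:

<meta name="robots" content="noindex, follow">

noindex tells the engines not to list the page, while follow still lets them pass through its links.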
Dec 17, 2007, 04:22 #7
It would be better to change your content on Site B for better SERPs.
Best of luck!
Dec 17, 2007, 04:29 #8
Dec 17, 2007, 09:22 #9
The situation is that both sites are generating profit, which is why the more specialized site shouldn't be shut down. Also, the specialized site mostly overlaps the general site, but it isn't strictly a subset: it also contains some content not found on the more general site.
Dec 17, 2007, 09:36 #10
Dec 17, 2007, 09:45 #11
Dec 17, 2007, 14:57 #12
Thanks stymiee.
Dec 17, 2007, 15:27 #13
OK, so how much should I change so that it won't pass as duplicate content? Is it enough to change some words, without much restructuring?
Dec 17, 2007, 16:44 #14
Dare I suggest that you go for a different keyword density?
Target a different keyword phrase and rewrite the copy. Use it in all the right places, and get links with that keyword phrase in the anchor text.
If it's still regarded as duplicate content by Google (and by you), then perhaps you should merge the two sites, because you won't be considered as offering something useful to the user.
Don't try to con Google into thinking it's different content; that way lies the dark side. Provide two quality sites.