I am implementing a blog on a real estate website.
The realtor already writes articles that appear in two places: (1) once a month in a local newspaper, whose online content is only available inside a downloadable PDF, and (2) on our local city's Patch.com site.
Will search engines, in particular Google, penalize the realtor's website because the article content is also present in two other places: the Patch.com HTML page and the newspaper's PDF?
Should he discontinue his newspaper and/or Patch.com submissions?
Or should the text differ between placements, and if so, by how much? Do sentences and words need to be rearranged, or is substituting synonyms for some words enough?
Ideally, the realtor would have the other two sites add a rel="canonical" link tag to their copies of the article, pointing back to the article on his site, so search engines know his page should receive the credit. Most likely they won't agree to this, so I would make sure the realtor publishes each article on his own site first and that it gets indexed before it appears on the other two sites. There is no guarantee, but that approach gives him the best chance of Google recognizing the article on his site as the original.
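For reference, a cross-domain canonical is just a single `<link>` element placed in the `<head>` of the duplicate page; the URLs below are placeholders, not the realtor's actual addresses:

```html
<!-- In the <head> of the Patch.com (or newspaper) copy of the article.
     The href points at the original article on the realtor's own site.
     Both URLs here are hypothetical examples. -->
<link rel="canonical" href="https://www.example-realtor.com/blog/original-article" />
```

Note that Google treats cross-domain canonicals as a hint rather than a directive, which is another reason publishing and getting indexed first still matters.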
Search engines don’t like multiple domains serving the same content.
The copied content on the other sites might affect your site's rankings once those pages are eventually indexed by the major search engines. Google filters results at serve time, collapsing the duplicates and showing only the page it considers most relevant in the SERPs for the query.
Thanks. This must happen a lot. For example, consider websites scraping content from cnn.com: in such a case, does the sheer popularity of cnn.com override any penalty it might get from Google for having the same content as the scraper sites?
I see so many websites with the same content on them these days, and I haven't noticed any penalization. Word around the web, though, is that you can get penalized. I wouldn't worry about it.