How to hide div content in a webpage from search engines?

I have a website and I want to hide the content of a div on a web page from search engines, but I am unable to do this. Please suggest how to hide the div content from search engines only, while keeping it visible to users.

I don’t know why you want to do this. In the past the content could be rendered with JavaScript, but I think search engines are smarter now and can read JS-generated content.
I think the options are to remove the content from the page altogether, or block the entire page with a robots meta tag.
Another way, which I would not condone, would be to display the content as an image. This would of course be an accessibility fail and probably not work well responsively; not a good idea.
To be honest, this all sounds a bit shady. If you want to do such things for SEO purposes, you have to be doing something very wrong.


This is something you do not want to do.

Cloaking refers to the practice of presenting different content or URLs to human users and search engines. Cloaking is considered a violation of Google’s Webmaster Guidelines because it provides our users with different results than they expected.


Actually, I have a parent website and more than 10 child websites for different products. But the main thing is that all the websites have the same terms and conditions, privacy policy and other things of this type. Now, I hope this is clear.

Could you tell me: I have a parent website and more than 10 child websites, all having the same terms and conditions, privacy policy, refund policy and more like these. How do I manage them all?

This is a different topic?

If they all have the same content, isn't it pretty much a case of doing it once and updating it only rarely?

You can block the entire page from being indexed by adding

<meta name="robots" content="noindex"> or <meta name="robots" content="noindex, nofollow"> in the head.
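If you cannot edit the page's HTML, the same directive can also be sent as an HTTP response header. As a sketch, assuming an Apache server with mod_headers enabled (the file names here are just examples):

```apache
# Send the noindex directive as an HTTP header instead of a meta tag.
# Requires mod_headers; adjust the pattern to match your own policy pages.
<FilesMatch "(privacy|terms)\.html$">
  Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>
```

Search engines that respect the robots meta tag treat the `X-Robots-Tag` header the same way, and it also works for non-HTML resources such as PDFs.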

See … and … for more information.

Is this a concern about being penalised for duplicate content?
You should not worry too much about that if the content is only things like Ts & Cs, etc. That is not likely the content that people are actively searching for with keywords.
Duplicate content within a site will generally just mean that one result is returned for searches matching that content, with the others ignored. But the unique parts of those pages should still show for relevant searches for that content.
For example, suppose you have 100 product pages, all with the same boilerplate privacy policy in the footer. Someone searches Google for “Privacy Policy”; just one result for your site may appear in SERPs, because the rest is repetition of the same content. The one page that appears may be a page about Digital Radios, while all your pages about Smart TVs were ignored. That does not mean the pages about Smart TVs are not indexed, they just don’t rank for that search term. If the user makes another search which is more specific and relevant to the product, like “Smart TV”, then the Smart TV page may appear in SERPs, regardless of the duplicate content in the footer, because that content is not relevant to the search term.
However, if the exact same policies apply to so many pages, it may be better to have just one page for them and have each product page link to it.
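To illustrate that single-page approach, a minimal sketch in plain HTML (the file names here are hypothetical):

```html
<!-- privacy.html: the one shared policy page for the whole site -->
<head>
  <!-- optional: keep the policy page itself out of the index -->
  <meta name="robots" content="noindex">
  <title>Privacy Policy</title>
</head>

<!-- on every product page, link to the policy instead of repeating its text -->
<footer>
  <a href="/privacy.html">Privacy Policy</a>
  <a href="/terms.html">Terms &amp; Conditions</a>
</footer>
```

Each product page then carries only a short link in its footer, so there is no block of duplicated policy text to worry about.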

More info here:-


Suppose I have a website (an online pharmacy) and made 10 websites for different medicines, all having the same T&C and P&P, then…

Please read my question carefully: I want to hide only a part of the content of my page, not the entire page.

No, suppose I have a parent domain (…) and its child domains (…, …, etc.); all the child domains have the same T&C, P&P, etc.

Sorry - I was assuming, when you said “terms and condition, privacy and policy, refund policy”, that you simply linked from each page to these pages, which is, in my experience, the more usual method. That would certainly be an easy way around your problem.

If I do this, will Google penalise my website?

If you do what?

Google understands that certain pages, like T&C or privacy policy, are common to many sites. As @SamA74 has explained, you will not be penalised for having the same information on multiple pages, but is it necessary to do that? Might it not be rather irritating to your visitors to see exactly the same text on every page? If you have a single page on each site which you link to from every page, visitors can easily find the information if they wish. You can use meta noindex to prevent these pages being indexed, if you wish.

Thanks, I will consider your point and see.

There is no penalty. What there is, is no guarantee of which duplicate content will appear where in the search results.

For example, Discourse forums have a FAQ page.

If you search for the first paragraph, there are many results.

I hope so…

This topic was automatically closed 91 days after the last reply. New replies are no longer allowed.