How can I keep the crawler from crawling certain content?

When the Google spider comes to my website to crawl content, I want some of that content not to be crawled by the Google crawler.


Is this for an actual website or a social media page (you posted in the social media forum)?

Yes, it's an actual website.

Do you have these pages disallowed in your robots.txt?

You can use a robots.txt file to disallow crawlers from crawling specific files or directories.
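
For example, a minimal robots.txt at the site root could look something like this (the paths are just placeholders for whatever you want to keep out of the crawl):

    User-agent: Googlebot
    Disallow: /private/
    Disallow: /page-to-hide.html

The Googlebot group targets Google's crawler specifically; a User-agent: * group would apply the same rules to all crawlers.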

Or use the robots meta tag to keep an individual page out of the index.
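
For example, putting this in the head section of a page asks crawlers not to index it (this is the standard noindex form; you can add nofollow to the content value as well if you also don't want the links on the page followed):

    <head>
      <meta name="robots" content="noindex">
    </head>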


And keep in mind that the instructions in robots.txt are only suggestions to crawlers. Most search engine crawlers follow them, but nothing forces them to. If your goal is simply to stop Google from crawling specific pages, that's fine; but if it's really a security issue, the answer is probably to put the content behind some sort of authentication wall.
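
As a rough sketch, assuming an nginx server and a hypothetical /private/ area, HTTP Basic authentication keeps crawlers and anonymous visitors out entirely (the location path and password-file path are placeholders):

    # Require a login for everything under /private/
    location /private/ {
        auth_basic           "Restricted";
        auth_basic_user_file /etc/nginx/.htpasswd;  # created with the htpasswd tool
    }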


You can do it in two ways:

  1. By configuring the robots.txt file,
    or
  2. By using a robots meta tag inside the head section of the particular page.
