When the Google crawler visits my website to crawl content, I want some of that content not to be crawled. How can I do that?
Is this for an actual website or social media page (you posted in the social media forum)?
Yes, it's an actual website.
Have you set these pages as disallowed in your robots.txt?
You can use a robots.txt file to disallow crawlers from specific files or directories, or use the robots meta tag to disallow individual pages on the site.
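For example, a minimal robots.txt might look like this (the paths here are just placeholders, substitute your own):

```
# Applies to all crawlers; use "User-agent: Googlebot" to target Google only
User-agent: *
# Block an entire directory (hypothetical path)
Disallow: /private/
# Block a single file (hypothetical path)
Disallow: /secret-page.html
```

The file must be named robots.txt and sit at the root of the domain, e.g. example.com/robots.txt.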
And keep in mind that the instructions in robots.txt are only suggestions to crawlers. Most search engine crawlers follow them, but nothing forces them to. If your goal is specifically to stop Google from crawling certain pages, robots.txt is fine; if it's more of a security issue, the answer is to put the content behind some sort of authentication wall.
You can do it in two ways:
- By configuring the robots.txt file
- By using a meta tag inside the head section of the particular page
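The meta-tag approach looks like this; it goes in the head of each page you want kept out of the index:

```
<head>
  <!-- Tells crawlers not to index this page or follow its links -->
  <meta name="robots" content="noindex, nofollow">
</head>
```

Note that for the crawler to see this tag, the page must not also be blocked in robots.txt, since a blocked page is never fetched and the tag is never read.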