How to block a web page?

Hello Friends

I want to block a web page of my website using robots.txt. Can you please tell me how to block it?

It’s very simple. :slight_smile:

```
User-agent: *
Disallow: /pageurl.html
```
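As a quick sanity check, Python's built-in `urllib.robotparser` can be fed those two lines directly to confirm the rule does what you expect (the domain and paths below are just placeholders):

```python
from urllib.robotparser import RobotFileParser

# Parse the robots.txt rules in memory instead of fetching them
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /pageurl.html",
])

# The blocked page is disallowed; other pages remain crawlable
print(rp.can_fetch("*", "https://example.com/pageurl.html"))  # False
print(rp.can_fetch("*", "https://example.com/other.html"))    # True
```

Note that well-behaved crawlers honour this, but robots.txt is advisory, not an access control.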

You’ll find helpful examples here, which cover most situations.

There are two ways to prevent search engine robots from crawling or indexing a site or particular pages. One is the robots.txt rule TechnoBear has written above; the other is the noindex meta tag. With the meta tag, Googlebot still fetches the page, but when the robot sees the noindex directive it stops indexing the page and does not show it in search results.
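For the meta-tag approach, a minimal sketch of what goes on the page itself:

```html
<!-- Place inside the <head> of the page you want kept out of the index -->
<meta name="robots" content="noindex">
```

One caveat: the crawler has to be able to fetch the page to see this tag, so don't also block the same page in robots.txt, or the noindex directive will never be read.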

Along with these two techniques, if your web page has already been cached and indexed, you can demote that link from search results with the help of Google Webmaster Tools, in the “Sitelinks” section.

The question has been answered, and as the OP has not returned it appears they are satisfied with the answer. There is no need to keep repeating it.

Thread closed.