How to block a URL with robots.txt?

You may want to prevent search engines such as Google from indexing some of your pages. A robots.txt file lets you do that; the steps below show how.

  1. Create a plain text file named “robots.txt”.
  2. Add directives as required, such as ‘User-agent’ and ‘Disallow’.
  3. ‘User-agent’ names the search engine crawler the rule applies to, and ‘Disallow’ gives the URL path you want to block.
  4. Type the directives into the text file.
  5. Save the file in the root directory of your domain (e.g. yourdomain.com/robots.txt). The following example blocks all crawlers from every page of the site:


User-agent: *
Disallow: /
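To block only a specific URL path rather than the whole site, you would use a more targeted rule, for example `Disallow: /private/` (the path here is a hypothetical placeholder). A quick way to check how crawlers will interpret your rules is Python's standard-library `urllib.robotparser`; the sketch below assumes the targeted rule just mentioned:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks only the /private/ path for all crawlers
rules = """User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Blocked: the URL falls under the disallowed path
print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False

# Allowed: everything outside /private/ stays crawlable
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))  # True
```

This is the same matching logic well-behaved crawlers apply, so testing your file this way before uploading it helps avoid accidentally blocking the entire site.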