If you want to stop Google's crawler from accessing (and thus indexing) some of your pages, here is how to do that with a robots.txt file.
- Create a plain text file named "robots.txt".
- Add directives such as 'User-agent' and 'Disallow' as your requirements dictate.
- 'User-agent' names the search engine/crawler the rule applies to, and 'Disallow' gives the URL path you want to block.
- Type the directives into the text file, one per line, and the file is ready.
- Upload it to the root directory of your domain so it is reachable at, for example, www.domainname.com/robots.txt
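Before uploading, you can sanity-check your rules with Python's standard-library `urllib.robotparser`. This is a minimal sketch: the `/private/` path and the domain below are hypothetical placeholders, not anything from your site.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: block Googlebot from /private/, allow all else.
rules = """\
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot is blocked from /private/ but may fetch other pages.
print(parser.can_fetch("Googlebot", "https://www.domainname.com/private/page.html"))
print(parser.can_fetch("Googlebot", "https://www.domainname.com/index.html"))
```

Running this prints `False` for the blocked path and `True` for the rest of the site, confirming the rules behave as intended.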