Why is the robots.txt file important?
The robots.txt file, which implements the robots exclusion protocol (also called the robots exclusion standard), is a plain text file that tells web robots (most often search engine crawlers) which pages on your site they may crawl and which they should skip. When a search engine is about to visit a page on your site, it first checks the robots.txt file at the site's root for instructions before fetching the target page.
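As a minimal sketch of what such instructions look like, here is an illustrative robots.txt (the paths and sitemap URL are hypothetical placeholders, not from any real site):

```text
# Apply these rules to all crawlers
User-agent: *
# Block crawling of the admin area
Disallow: /admin/
# Explicitly permit a public subfolder inside a blocked area
Allow: /admin/public/
# Point crawlers to the XML sitemap (example URL)
Sitemap: https://www.example.com/sitemap.xml
```

The file lives at the root of the domain (e.g. `https://www.example.com/robots.txt`); rules placed anywhere else are ignored by crawlers.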