Search engines use robots (also called user agents or crawlers) to crawl your pages. The robots.txt file is a plain-text file that defines which parts of a domain a robot may crawl. In addition, the robots.txt file can include a link to the XML sitemap.
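As a minimal illustration, a robots.txt that allows every bot to crawl every page and links to a sitemap might look like this (the sitemap URL is a placeholder for your own domain):

```
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```

An empty `Disallow:` directive means nothing is blocked; listing a path instead (for example `Disallow: /admin/`) would exclude that part of the site from crawling.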
Caution: this option allows every bot to crawl every single page of your website.
Click Customize to set additional rules.
Get more traffic and customers by optimizing your website, content and search performance. What are you waiting for?