The robots.txt file is a plain text file placed at the root of your website that tells search engine crawlers which URLs they may or may not crawl. While not mandatory, it's a useful tool for steering crawler behavior and keeping bots away from irrelevant or low-value pages. Note that it is advisory, not a security mechanism: well-behaved crawlers honor it, but it does not actually block access to sensitive content.
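For illustration, here is a minimal sketch of a robots.txt file; the disallowed paths and the sitemap URL are placeholders, not recommendations for any particular site:

```
# Rules below apply to all crawlers
User-agent: *

# Keep crawlers out of these areas (hypothetical paths)
Disallow: /admin/
Disallow: /tmp/

# Everything else remains crawlable
Allow: /

# Tell crawlers where the sitemap lives (placeholder URL)
Sitemap: https://example.com/sitemap.xml
```

Each `User-agent` line starts a group of rules for the named crawler (`*` matches all), and `Disallow`/`Allow` rules are matched against the URL path. The file must be served at the site root, e.g. `https://example.com/robots.txt`, to be honored.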