A robots.txt file is a plain-text file placed in the root directory of a website. It tells search engine crawlers which of the site's URLs they may visit, and thereby influences which pages end up in the search index. This makes it possible to keep certain subpages out of search results.
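As a sketch of what such a file looks like, the following hypothetical robots.txt asks all crawlers to stay out of an `/archive/` section while leaving the rest of the site crawlable (the paths are illustrative, not from the original text):

```
User-agent: *
Disallow: /archive/
Allow: /
```

Each `User-agent` line names the crawler the rules apply to (`*` matches all), and each `Disallow` line gives a URL path prefix that crawler should not request.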
Example: With a robots.txt file, you can exclude a site's archive from search results. Keep in mind, however, that robots.txt is only a request: some crawlers ignore it, and a blocked page can still surface in results if other sites link to it. If you really need to hide a subpage from search results, protect it with a password or other server-side access control.
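The crawling rules described above can be checked programmatically. This sketch uses Python's standard-library `urllib.robotparser` against a hypothetical robots.txt (the domain and paths are assumptions for illustration) to see which URLs a well-behaved crawler would fetch:

```python
from urllib import robotparser

# Hypothetical robots.txt that blocks the /archive/ section for all crawlers.
rules = """\
User-agent: *
Disallow: /archive/
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(rules)  # parse the rules directly instead of fetching them

# A compliant crawler may not fetch archive pages...
print(parser.can_fetch("*", "https://example.com/archive/2023/"))  # False
# ...but the rest of the site remains crawlable.
print(parser.can_fetch("*", "https://example.com/blog/post-1"))    # True
```

Note that this only tells you what a *compliant* crawler would do; as stated above, nothing in robots.txt technically prevents access to the blocked URLs.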