A robots.txt file sits at the root of your site and indicates which parts of your site you don't want accessed by search engine crawlers. Use robots.txt only to manage crawling traffic, typically because your server can't handle the load from Google's crawler or to avoid wasting crawl budget on unimportant or similar pages. Do not use robots.txt to hide a web page from Google search results; it may not work reliably, since a blocked page can still be indexed if other pages link to it.
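As a sketch, a minimal robots.txt that reduces crawl traffic might look like this (the paths shown are hypothetical examples, not from the original text):

```
# Block all crawlers from a low-value section of the site
User-agent: *
Disallow: /search-results/

# Block only Googlebot from a duplicate-content directory
User-agent: Googlebot
Disallow: /print-versions/
```

Note that `Disallow` only discourages crawling; to keep a page out of search results, a `noindex` directive or password protection is the appropriate mechanism.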