To disallow all bots from crawling or indexing any part of your website, you can create a
robots.txt file with the following content:
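
User-agent: *
Disallow: /
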
This robots.txt file tells all bots that they are not allowed to crawl any part of your website. The User-agent: * directive applies to all bots, while the Disallow: / directive specifies that all pages and directories should be disallowed.
Make sure to save the robots.txt file in the root directory of your website (so it is reachable at yourdomain.com/robots.txt), since that is the only location bots look for it. Keep in mind that robots.txt is only advisory: well-behaved crawlers honor it, but some bots ignore it entirely, so you may also want to consider other methods of protecting your website from unwanted access.
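
If you want to confirm that the file is being served from the root and that the rules behave as expected, you can check it with a short script. Below is a minimal sketch using Python's standard urllib.robotparser module; https://example.com is a placeholder for your own domain, and the script assumes the robots.txt above is already deployed.

    from urllib.robotparser import RobotFileParser

    # Fetch and parse the robots.txt from the site root.
    # https://example.com is a placeholder; substitute your own domain.
    parser = RobotFileParser("https://example.com/robots.txt")
    parser.read()

    # With "User-agent: *" and "Disallow: /", both checks should print False,
    # meaning the paths are off-limits to crawlers that honor robots.txt.
    print(parser.can_fetch("*", "https://example.com/"))
    print(parser.can_fetch("Googlebot", "https://example.com/some-page"))

If either call prints True, the file is probably not being served from the expected location or the directives are not being picked up as intended.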