A robots.txt file is not an appropriate or effective way to block sensitive or private material. It only tells well-behaved crawlers that the pages are off limits; it does not stop your server from serving those pages to any browser that requests them. One reason is that search engines can still reference the URLs you block, for example when other pages link to them.
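
To make the distinction concrete, here is a minimal Python sketch contrasting what a polite crawler does with what any client can do. The URLs (example.com and /private/report.html) are hypothetical placeholders, not real pages; the point is only that robots.txt is advisory, while the direct request goes through unless the server enforces real access control.

```python
import urllib.robotparser
import urllib.request

# Read the site's robots.txt the way a well-behaved crawler would.
rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")  # hypothetical site
rp.read()

# A polite crawler checks the rules before fetching; a Disallow line
# for /private/ would make this return False.
print(rp.can_fetch("*", "https://example.com/private/report.html"))

# Nothing prevents a direct request, though: the server still serves
# the page unless it requires authentication or other access control.
response = urllib.request.urlopen("https://example.com/private/report.html")
print(response.status)
```

In other words, robots.txt only changes the behavior of clients that choose to honor it; protecting private content requires server-side controls such as authentication.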