I want to hide a part of my Drupal site (actually a content type). I'm using clean URLs and URL aliases, and I read about how to do this with robots.txt.
But now I'm a little bit confused. Take the following example page: example.com/nosearch/test. I want to hide all pages under /nosearch/, so I do it like this:
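(Presumably the robots.txt rule looks something like this; a minimal sketch, assuming all crawlers should be blocked from that path prefix:)

```
# Block all crawlers from every URL starting with /nosearch/
User-agent: *
Disallow: /nosearch/
```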
Everything works fine. But then I found out that "example.com/nosearch/test" can also be reached via "example.com/node/233" (if you know the node number, of course). So do I need to put that in my robots.txt file as well, like "Disallow: /node/233"?