Status: Active
Project: PHPIDS
Version: 6.x-2.x-dev
Component: Code
Priority: Minor
Category: Bug report
Assigned: Unassigned
Reporter:
Created: 8 Jan 2013 at 02:26 UTC
Updated: 8 Jan 2013 at 02:26 UTC
I run Drupal 6.27 with the PHPIDS module. The status report claims I need to add two Disallow: entries to robots.txt, and the robots.txt it checks is the one in the Drupal base directory.
However, I run Drupal in a subdirectory that is exposed via an Apache Alias configuration. Because of this, the robots.txt that webcrawlers actually fetch has to live in the root directory of the webserver, not in the Drupal base directory. It would make sense for the module to handle this setup.
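For context, a setup like the one described might look roughly like this in the Apache configuration (the paths here are hypothetical, not taken from the actual site):

```
# Document root of the webserver; this is where crawlers fetch /robots.txt
DocumentRoot /var/www/html

# Drupal lives outside the document root and is mapped in via an Alias,
# so /var/www/drupal/robots.txt is never served at /robots.txt
Alias /drupal /var/www/drupal
```

With such a layout, only /var/www/html/robots.txt is ever visible to crawlers, while PHPIDS inspects /var/www/drupal/robots.txt.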
The workaround is to add the requested Disallow: entries to both files: the robots.txt that is actually served, and also the one in the Drupal base directory, even though that copy is never checked by webcrawlers, purely to make the PHPIDS module happy.
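As a sketch of that workaround, the same entries end up duplicated in both files; the Disallow: lines below are placeholders, not the actual entries the PHPIDS status report requests:

```
# robots.txt in the webserver's document root -- the copy crawlers actually fetch
User-agent: *
Disallow: /drupal/example-path-1
Disallow: /drupal/example-path-2

# The identical entries must also be placed in the robots.txt inside the
# Drupal base directory, solely so the PHPIDS status check passes.
```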