I run Drupal 6.27 with the PHPIDS module.

However, the status report asks me to add two Disallow: entries to robots.txt.
The robots.txt it checks is the one in the Drupal base directory.

I run Drupal within a subdirectory, so robots.txt has to live in the root directory of the webserver, not in the Drupal base directory, which is mapped in via an Apache Alias directive.
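For reference, a minimal sketch of the kind of setup I mean (all paths here are hypothetical examples, not my actual configuration):

```apache
# Drupal lives outside the document root and is mapped to a subdirectory URL.
Alias /drupal /var/www/drupal

# Crawlers request http://example.com/robots.txt, which Apache serves from the
# document root (e.g. /var/www/html/robots.txt) -- NOT from
# /var/www/drupal/robots.txt, which is the file the PHPIDS status check reads.
```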
Maybe it makes sense for the module to handle this condition.

The workaround is to add the requested Disallow: entries to both files: the robots.txt that is actually served and the one in the Drupal base directory. Even though the latter is never fetched by web crawlers, this keeps the PHPIDS module happy.