I'm using the robotstxt module, and robots.txt can't be found when I enable the Fast 404 module. Can you fix this?
Comment | File | Size | Author |
---|---|---|---|
#9 | README.TXT_.patch | 713 bytes | dnmurray |
Comments
Comment #1
soyarma CreditAttribution: soyarma commented
What you'll want to do is alter the regex that checks file extensions. The default is:
Change it to:
It will then not match robots.txt.
Comment #2
coveryoureyes CreditAttribution: coveryoureyes commented
...you mean change it to:
$conf['fast_404_exts'] = '/[^robots]\.(txt|png|gif|jpe?g|css|js|ico|swf|flv|cgi|bat|pl|dll|exe|asp)$/i';
Comment #3
soyarma CreditAttribution: soyarma commented
Yes, sorry about that missing /.
Comment #4
soyarma CreditAttribution: soyarma commented
I'm debating making this a default... Thoughts?
Comment #5
soyarma CreditAttribution: soyarma commented
This has been made a default.
Comment #6
Geldora CreditAttribution: Geldora commented
Actually, this was not set as the default...
I've just installed the latest version and I still have a problem with the robotstxt module. I need to use settings.php to be able to exclude the robots.txt path.
Please consider making an exception for robots.txt the default.
Comment #7
dnmurray CreditAttribution: dnmurray commented
This is wrong:
$conf['fast_404_exts'] = '[^robots]\.(txt|png|gif|jpe?g|css|js|ico|swf|flv|cgi|bat|pl|dll|exe|asp)$/i';
That's going to match any file that ends in r, o, b, t, or s followed by . and one of the extensions in the list. I'm getting an Apache 404 for nosuchfile.txt instead of the fast 404 response; nosuchfiles.txt gets me the fast 404 response.
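For reference, `[^robots]` is a negated character class: it matches any single character that is not r, o, b, t, or s, rather than excluding the word "robots". A sketch in Python (whose `re` lookarounds and character classes behave like PCRE's here, with a hypothetical shortened extension list) shows the side effect:

```python
import re

# [^robots] requires the character before the dot to be anything
# EXCEPT r, o, b, t, or s -- it does not test for the word "robots".
pattern = re.compile(r'[^robots]\.(txt|png|gif|jpe?g|css|js|ico)$', re.I)

print(bool(pattern.search('robots.txt')))       # False: 's' precedes the dot
print(bool(pattern.search('nosuchfile.txt')))   # True:  'e' precedes the dot
print(bool(pattern.search('nosuchfiles.txt')))  # False: ends in 's', excluded too
```

So robots.txt is indeed excluded, but so is every other file whose base name happens to end in one of those five letters.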
Comment #8
dnmurray CreditAttribution: dnmurray commented
I think this is what you want, and you should update the readme.txt:
$conf['fast_404_exts'] = '/(?<!robots)\.(txt|png|gif|jpe?g|css|js|ico|swf|flv|cgi|bat|pl|dll|exe|asp)$/i';
Unfortunately, this will also exclude any-prefix-robots.txt, which we may not want.
Comment #9
dnmurray CreditAttribution: dnmurray commented
Patch for the above.
Comment #10
Anonymous (not verified) CreditAttribution: Anonymous commented
Subscribing; same issue with the latest 7.x version.
Comment #11
soyarma CreditAttribution: soyarma commented
I actually realized this and corrected it as well when I was providing a regex for private files.
However, I used (?!robots) and it seems to work OK.
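Comment #11 doesn't show where the lookahead was placed; one plausible placement (an assumption, not quoted from the module) is anchoring it at the start of the path, which excludes only names that begin with "robots". A Python sketch:

```python
import re

# Anchored negative lookahead: reject only paths that START with "robots".
# The exact placement in the module's regex is assumed here.
pattern = re.compile(r'^(?!robots).*\.(txt|png|gif|jpe?g|css|js|ico)$', re.I)

print(bool(pattern.search('robots.txt')))    # False: excluded from fast 404
print(bool(pattern.search('nosuchfile.txt')))# True
print(bool(pattern.search('myrobots.txt')))  # True: prefixed names match again
```

The trade-off is inverted relative to the lookbehind: myrobots.txt is handled by the fast 404 again, while a hypothetical robots2.txt would still be spared because it starts with "robots".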