Closed (fixed)
Project:
RobotsTxt
Version:
8.x-1.x-dev
Component:
Documentation
Priority:
Normal
Category:
Task
Assigned:
Unassigned
Reporter:
Created:
2 Dec 2016 at 12:37 UTC
Updated:
5 Sep 2025 at 15:03 UTC
Comments
Comment #2
marcelovani commented
Adding the patch to README.txt with the instructions.
To test this, edit the composer.json used to build the site and add the script section.
Then run composer install or composer update.
PS: change the location of the file as per your setup. I use a folder called web.
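The script section referred to in #2 is not shown on this page. As a sketch only, a composer.json scripts entry that deletes the scaffolded file after install and update (assuming, as the author says, a web root folder called web) would look roughly like:

```json
{
    "scripts": {
        "post-install-cmd": [
            "rm -f web/robots.txt"
        ],
        "post-update-cmd": [
            "rm -f web/robots.txt"
        ]
    }
}
```

Composer runs the post-install-cmd and post-update-cmd event scripts after composer install and composer update respectively, so the file is removed each time it is re-scaffolded.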
Comment #4
hass commented
Comment #6
marcelovani commented
Comment #8
steveoriol commented
Maybe we can add another command to make sure the robotstxt module is enabled on all multisite instances, such as "drush @sites pm-enable robotstxt -y", or the equivalent with Drupal Console.
I changed the command to "rm -f web/robots.txt" so there is no error message if robots.txt has already been deleted.
Comment #9
nishtha.pradhan commented
To automatically remove the robots.txt file and ensure that it stays removed, the following entry can be made in composer.json:
This will ensure that robots.txt is skipped when composer install or composer update is run.
Comment #10
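The composer.json entry itself did not survive on this page. With the older drupal-composer/drupal-scaffold plugin, the documented way to skip a scaffold file was an excludes list, so the entry was presumably something like (an assumption, not the commenter's verbatim snippet):

```json
{
    "extra": {
        "drupal-scaffold": {
            "excludes": [
                "robots.txt"
            ]
        }
    }
}
```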
papa ours commented
That's the most elegant way, thank you!
Comment #11
chike commented
As per #9: this didn't work for me, the file still got scaffolded.
This worked, and the file got removed.
Comment #12
chike commented
Well, only until I moved to the live server. On the live server, the script didn't work for me either.
Comment #13
savage1974 commented
Hi, all.
Try this:
This helped me solve the problem.
Best regards
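The snippet from #13 is also missing from the page. On newer sites that use drupal/core-composer-scaffold, the documented way to stop robots.txt from being scaffolded is a file-mapping entry set to false, so the fix was plausibly along these lines (an assumption about what the missing snippet showed):

```json
{
    "extra": {
        "drupal-scaffold": {
            "file-mapping": {
                "[web-root]/robots.txt": false
            }
        }
    }
}
```

Unlike a post-install delete script, this prevents the file from being written in the first place, which also works on environments where the scripts section is not executed.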
Comment #14
chike commented
Thanks @savage1974, #13 works.