Following up on this closed issue: https://www.drupal.org/node/1260912

I have a solution for D8.

We can use Composer scripts to remove the file.

Comment | File | Size | Author
#2 | removing_robotstxt-2832437-1.patch | 948 bytes | marcelovani

Comments

marcelovani created an issue. See original summary.

marcelovani’s picture

Status: Active » Needs review

Attaching a patch that adds the instructions to README.txt.

To test this, edit the composer.json used to build the site and add the scripts section.

Then run composer install or composer update.

PS: adjust the location of the file to match your setup. I use a folder called web.
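The scripts section itself is in the attached patch; as a sketch (assuming a docroot folder named web, as above, and not necessarily identical to the patch contents), the addition to composer.json would look something like this:

```json
{
    "scripts": {
        "post-install-cmd": [
            "rm web/robots.txt"
        ],
        "post-update-cmd": [
            "rm web/robots.txt"
        ]
    }
}
```

Composer runs these commands after every install and update, so the file is deleted again each time core is rebuilt.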

  • hass committed 474c8b6 on 8.x-1.x authored by marcelovani
    Issue #2832437 by marcelovani: Automatically removing robots.txt...
hass’s picture

Status: Needs review » Fixed

  • hass committed cbb95bb on 8.x-1.x
    Issue #2832437 by hass: Remove confusing subfolder
    
marcelovani’s picture

Assigned: marcelovani » Unassigned

Status: Fixed » Closed (fixed)

Automatically closed - issue fixed for 2 weeks with no activity.

steveoriol’s picture

Maybe we can add another command to make sure that the robotstxt module is installed on all site instances, e.g. "drush @sites pm-enable robotstxt -y", or the equivalent with Drupal Console...

I changed the command to "rm -f web/robots.txt" so that there is no error message if robots.txt has already been deleted.
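With that change, the scripts section (same assumption of a web/ docroot) becomes:

```json
{
    "scripts": {
        "post-install-cmd": [
            "rm -f web/robots.txt"
        ],
        "post-update-cmd": [
            "rm -f web/robots.txt"
        ]
    }
}
```

The -f flag makes rm exit successfully even when the file does not exist, so the Composer run is not aborted by the script.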

nishtha.pradhan’s picture

In order to automatically remove the robots.txt file and ensure that it stays removed, the following entry can be made in composer.json:

"extra": {
        "drupal-scaffold": {
            "file-mapping": {
                "[web-root]/robots.txt": {
                    "mode": "skip"
                }
            }
        },
-----------other lines of codes--------------- 
    }

This ensures that robots.txt is skipped when composer install or composer update is run.

papa ours’s picture

That's the most elegant way, thank you!

chike’s picture

As per #9

This didn't work for me.

            "file-mapping": {
                "public_html/robots.txt": {
                    "mode": "skip"
                }
            }

The file still got scaffolded.

This worked.

    "scripts": {
      "post-install-cmd": [
        "rm public_html/robots.txt"
      ],
      "post-update-cmd": [
        "rm public_html/robots.txt"
      ]
    }

The file got removed.

chike’s picture

Well, only until I moved to the live server. On the live server, the script didn't work for me either.

savage1974’s picture

Hi, all.

Try this:

      "file-mapping": {
           "[web-root]/robots.txt": false
      }

This helped me solve the problem.

best regards
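For reference, a sketch of the full extra section using this shorthand (assuming the drupal/core-composer-scaffold plugin is managing the scaffold files):

```json
{
    "extra": {
        "drupal-scaffold": {
            "file-mapping": {
                "[web-root]/robots.txt": false
            }
        }
    }
}
```

Setting the mapping to false is shorthand for skipping the file, equivalent to the longer {"mode": "skip"} form shown earlier, and [web-root] is resolved by the plugin to the configured docroot.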

chike’s picture

Thanks @savage1974

#13 works.