I keep getting "migration failed":
Could not back up sites directory for drupal <----- This is where it fails
Returned from hook drush_provision_drupal_provision_backup
Removed stale backup file /var/aegir/backups/oldsite-20150728.204529.tar.gz <---- yet it is still able to remove the backup
I am totally confused on this one.
Comment | File | Size | Author
---|---|---|---
#24 | 0001-Stop-checking-ownership-on-private-temp.patch | 1.63 KB | memtkmcc
#18 | provision-backup_sites_without_temp-2542236-18.patch | 3.04 KB | colan
#18 | interdiff-2542236-15-18.diff | 1.33 KB | colan
Comments
Comment #1
mshepherd CreditAttribution: mshepherd as a volunteer commented

I'm seeing this same problem with a site I've imported from a previous Aegir version (1.x >> 3.x). I can neither back up nor migrate the site.
It seems likely this is a permissions issue, but I can't figure it out just yet.
Comment #2
mshepherd CreditAttribution: mshepherd as a volunteer commented

I solved my issue.
There were a number of files in .../private/temp with www-data:www-data ownership and 600 permissions. These files weren't present in the imported site, so they must have been created after import. Changing the permissions on those files to 660 meant that I could back up and migrate the site.
Comment #3
mshepherd CreditAttribution: mshepherd as a volunteer commented

I note that after the site was migrated, the files in that directory had aegir:www-data ownership and 660 permissions.
Comment #4
ergonlogic

The issue described in #2, which is by far the most common cause of backups failing, is due to a bug in Drupal core; see #2496173: file_unmanaged_save_data() doesn't clean up its temp files. The patches in that issue (for both D7 and D8) resolve this one, but must be applied to hosted platforms. Consider adding them to your makefiles.
FWIW, the backup task failure in such cases is intentional. Previously, errors in tarring up site files would be masked by a pipe in our system call. While this would be relatively harmless in the case of broken file permissions (it'd still block platforms from being deleted), it could also lead to corrupt backups.
Let us know if this fixes the issue for you.
Comment #5
mshepherd CreditAttribution: mshepherd as a volunteer commented

Thanks. I'll need to migrate several more sites in the coming days or perhaps weeks, so I should have a chance to test this out. Many thanks.
Matthew
Comment #6
ergonlogic

@SocialNicheGuru, can you let us know if changing the permissions/ownership of the site's files helps here? Feel free to ping me in #aegir if you'd like some help checking this out.
Comment #7
SocialNicheGuru CreditAttribution: SocialNicheGuru commented

I did a
chown -R aegir files
There were a number of files that I could not change ownership on. For those files I did a
sudo chmod 775
but it didn't work for my tmp directory. I had to delete all the files in the private/files/tmp directory for it to work :(
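Pulling together the fixes from #2, #3, and #7, here is a minimal sketch of the cleanup. The site path and the aegir:www-data ownership scheme are illustrative assumptions; adjust them to your own install.

```shell
#!/bin/sh
# Sketch: restore Aegir-friendly ownership/permissions on a site's
# private files. SITE is a hypothetical path; point it at your site.
SITE=/var/aegir/platforms/example/sites/example.com

# Temp files written by PHP often end up www-data:www-data with mode
# 600, which the aegir user cannot read when tarring up a backup.
sudo chown -R aegir:www-data "$SITE/private"

# Directories need the execute bit for traversal; files only need rw.
sudo find "$SITE/private" -type d -exec chmod 770 {} +
sudo find "$SITE/private" -type f -exec chmod 660 {} +
```

If even that fails on some files (as in #7), deleting the contents of the temp directory is the remaining workaround.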
Comment #8
dnotes CreditAttribution: dnotes commented

In my case there were other files that were not readable or writable by the user. (I'm trying to use BOA Aegir on a local Vagrant box with the static platforms folder shared via NFS and then fused into the file system with forced permissions, and it's a bit messy for file permissions.)
Anyhow, if it turns out that the files/tmp directory is not your problem, you may find other unreadable files in your sites folder using e.g.
find . ! -perm -u+r
or, for files that are not assigned to the correct user,
find . ! -user o1
(substitute your own site user for o1). Nice to know that this seems to be mainly a file ownership/permissions problem. Thanks for this issue.
Comment #9
omega8cc CreditAttribution: omega8cc commented

We have reverted (in BOA HEAD) the patch from #2377819: Gzipping backups suppresses file permissions errors, which caused just too many pseudo-problems / support tickets for us.
EDIT: I have explained why the fix is too aggressive and causes more problems than it solves in this comment.
Comment #10
MrAdamJohn CreditAttribution: MrAdamJohn commented

As noted in #2 above,
chmod 660 [path]/private/temp
worked perfectly.
Is there work being done to carry the underlying bug fix into Provision? Ping me off-thread if needed.
Thanks, @mshepherd and @ergonlogic!
Comment #11
helmo CreditAttribution: helmo at Initfour websolutions commented

I think the catch-all solution here will be in #2616426: Add 'fix permissions' task.
Comment #12
colan

I don't believe there's any reason to back up temporary files. We can avoid the permissions issues altogether if we simply exclude the temporary files directory from each backup.
This has the happy side effect of reducing the size of backups (even if the permissions are correct).
Can we not simply do something like this?
Comment #13
bgm CreditAttribution: bgm commented

+1 on the patch. It's a bit annoying to 1) run backup, which fails, 2) run verify, and 3) run backup again, hoping no files were created in the meantime.
Comment #14
colan

There's some debate about where the exclude option should go. I went with the one that worked for me, but didn't want to commit this yet until we're fairly sure it'll work in the general case. Otherwise, it'll break backups in other places.
So please review and test on your own systems.
Comment #15
colan

After letting automated backups run for a while, and inspecting the contents to ensure nothing was missing, I noticed that we're also archiving CSS and JS cache files, as well as site-specific Git repositories. None of these should be included, so I've added them to the exclusion list. Here's the patch for that.
For the site I was reviewing, the backup shrank from 27M to 9.3M. This makes sense because the newly excluded files are mostly already compressed, so (unlike the DB) they can't be shrunk much further.
Comment #16
bgm CreditAttribution: bgm commented

If we migrate a site from one platform to another, wouldn't this be equivalent to deleting the local git repo?
Request: Would it be possible to add "files/civicrm/templates_c" to the list? It's a smarty cache from CiviCRM. Admittedly it's not Drupal-specific, but there are many Aegir CiviCRM users, and it would really help as well.
Comment #17
colan

Good catch re: Git repo. We need to remove that one.
For the Civi thing, it might make sense to add a hook here, and then modules can add their own exclusions. There will be more.
Comment #18
colan

This should account for both of the above items.
Comment #19
bgm CreditAttribution: bgm commented

Looks good, based on code review. Thanks for implementing the hook!
Comment #20
helmo CreditAttribution: helmo at Initfour websolutions commented

Works as expected in a quick test.
Comment #21
colan

Minor side effect: when cloning (and probably migrating), this causes some warnings to show up because the
temp
directory is missing. This isn't actually a problem, because it gets created later in the process. So maybe we should stop issuing these warnings?
For context, it happens right after this:
Comment #23
colan

Committed #18 as there were no reported problems a while after RTBC.
Setting back to Active for #21.
Comment #24
memtkmcc CreditAttribution: memtkmcc at Omega8.cc commented

That's correct -- since private/temp is no longer included in backups, and is only recreated later during migration, this check needs to be removed so it doesn't cause confusion from an (otherwise harmless) warning in the task log.
Comment #25
memtkmcc CreditAttribution: memtkmcc at Omega8.cc commented

Comment #26
colan

But if it does exist, shouldn't the group still be changed?
Comment #27
memtkmcc CreditAttribution: memtkmcc at Omega8.cc commented1. It will not be included in any backup anymore, so not sure how it could exist there?
2. Its group will be changed a moment later in
function _provision_drupal_create_directories
anyway3. This check doesn't change anything, it's just a check, no longer relevant, at least no longer relevant directly after the archive is expanded and before
function _provision_drupal_create_directories
is run.EDIT for #3 -- This check doesn't change anything, because the directory doesn't exist at this point, so it's just a check for the directory existence, which obviously fails.
Comment #28
colan

That explanation works for me.
Comment #29
memtkmcc CreditAttribution: memtkmcc at Omega8.cc commented

I realised my point #3 was not clear enough, so added this:
EDIT for #3 -- This check doesn't change anything, because the directory doesn't exist at this point, so it's just a check for the directory existence, which obviously fails.
Comment #30
colan

Looks like this was fixed in 4d870e1c2df3d204c38437e98a4624dccc037db5.