Please DO NOT push to this branch. See How to Help for instructions for working in the contrib module.
The latest work for Package Manager is happening in the Automatic Updates contrib module. There is a conversion process, but it is not being run constantly, so the contrib module is the place for now to review the latest code; see #37 (should we just postpone?)

#3319030: Drupal 10 Core Roadmap for Automatic Updates

Overview

Package Manager is an API-only module which provides the scaffolding and functionality needed for Drupal to make changes to its own running code base via Composer. It doesn't have a user interface.

The easy question to answer: Why?

Why build this if not everyone can use it?

  • Two of the current strategic initiatives, Automatic Updates and Project Browser, are creating user interfaces to run Composer commands. Package Manager was created to facilitate this common need.
  • These two initiatives both have known restrictions, in that they will not be usable on all sites because of the system requirements. The main system requirements are that the codebase must be writable and the Composer executable must be available.

System requirements:

All of the following system requirements have corresponding Package Manager validators (see core/modules/package_manager/src/Validator for these and other validators) which help determine whether the system is currently compatible with Package Manager. Modules that use Package Manager may display any validation errors in their respective user interfaces. A simplified sketch of the kind of checks involved follows the list below.

  1. The codebase must be writable by the web server. Because this is intrinsic to the purpose of this module and the two modules that depend on it, this requirement is unlikely to change in the future. Although this will prevent Package Manager from working on some hosting environments, some users may use Package Manager in local or cloud environments where the file system is writable, even if their production environment is not. Project Browser is the most obvious example of this, because installing modules is common when building a site and is often done locally. In Automatic Updates, updating manually via the UI (rather than automatically during cron) may be done by some users in development environments, which is beneficial even if Automatic Updates cannot be used directly in the production environment.

    To provide Automatic Updates to sites even if the web server cannot write to the file system, the module provides a Symfony Console command to run updates. In this case an outside cron job would be set up to run the console command and perform automatic updates. The codebase would have to be writable by the system user set up to run the cron job, but not by the user running the web server.

  2. The executable composer.phar must be available somewhere on the system and runnable by the web server using proc_open (i.e., through the Symfony Process component). It's used to inspect the codebase and to install or update packages.
  3. The web server must be able to write to a temporary directory (i.e., the one returned by \Drupal\Core\File\FileSystemInterface::getTempDirectory), which must be outside of the web root. (Because Composer commands may fail, file system operations may fail, package downloads may fail, and so on, we run Composer commands not on the live site, but a copy of it.)
  4. Certain types of filesystem links cannot be present in the codebase; for example, hard links would prevent running Composer commands in isolation. See the NoUnsupportedLinksExist precondition documentation for details. The most common type of link is the symlink, which is supported, with one exception due to a PHP limitation: see also Symlinks to directories don't work with PhpFileSyncer.
  5. The Drupal site cannot be part of a multisite, due to the danger involved with changing the codebase of several sites at once.
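The first three requirements boil down to ordinary filesystem and process checks. The sketch below only illustrates the kind of checks the real validators perform; the function name is hypothetical, and the actual validators are event subscribers under core/modules/package_manager/src/Validator.

```php
<?php

use Drupal\Core\File\FileSystemInterface;
use Symfony\Component\Process\ExecutableFinder;

/**
 * Illustrative only: simplified versions of requirement checks 1-3.
 */
function example_check_package_manager_requirements(FileSystemInterface $file_system): array {
  $errors = [];

  // Requirement 1: the codebase must be writable by the web server.
  if (!is_writable(DRUPAL_ROOT)) {
    $errors[] = 'The Drupal codebase is not writable by the web server.';
  }

  // Requirement 2: a Composer executable must be discoverable and runnable.
  if ((new ExecutableFinder())->find('composer') === NULL) {
    $errors[] = 'The composer executable could not be found.';
  }

  // Requirement 3: the temporary directory must be writable (and, in the
  // real validator, outside the web root).
  $temp_dir = $file_system->getTempDirectory();
  if (!is_dir($temp_dir) || !is_writable($temp_dir)) {
    $errors[] = "The temporary directory $temp_dir is not writable.";
  }

  return $errors;
}
```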

Biggest risk: breaking the live site

Destructive Composer operations are never done in the live codebase - only in the staged copy created by Composer Stager. If a Composer operation fails there, the live site will be unaffected.

While unlikely, it is possible that while copying the files from the staged version of the codebase back into the live site, an unexpected error or failure could occur. This is hard to avoid because any filesystem operation always has some chance of failing.

Package Manager takes several steps to ensure that users are informed about failed operations.

At the last possible moment before Package Manager asks Composer Stager to copy staged changes into the live site (thus overwriting the files of the live site), it writes a "failure marker" file, which it deletes after the copy succeeds. If Composer Stager raises an error during the copy, the "failure marker" file is NOT deleted. The presence of this marker file, then, indicates that the staged changes were only partially copied, and the site's codebase is corrupted. In this situation, the site's codebase should be restored from a backup; Package Manager will also flag a requirements error about this.

The "failure marker" file method is used instead of tracking state somewhere in the database, because it's an atomic operating system operation completely independent of Drupal's state.

Biggest challenge: testing

Testing Package Manager presents challenges unlike anything else in Drupal core.

Among these challenges are:

  • Testing actual Composer commands for both non-destructive and destructive operations. To fully test the system and its many parts, we needed to be able to test all those parts as realistically as possible, with actual, unsimulated Composer commands, without hitting the internet.
  • Testing codebase replacement (i.e., the "apply" phase) without replacing the actual codebase that is running the test.

We tried several approaches to testing that we found did not work with the above challenges:

  • Relying on vfsStream in kernel tests: Composer commands don't work with VFS. It was also much harder to debug tests, because determining the state of the virtual "active" and "staged" directories was nearly impossible. Our kernel tests now use the real filesystem.
  • Altering Composer metadata files directly: In an effort to avoid bad performance from running many Composer commands in many tests, we were first altering the Composer metadata files (like composer.lock, installed.json, and even installed.php) directly to simulate changes made by Composer. This worked for a while, but during #3316368: Remove our runtime dependency on composer/composer: remove ComposerUtility we realized that our alterations were not sufficient to be usable by real Composer commands, and therefore made the tests unrealistic and hard to maintain.

Key components of our current testing infrastructure are:

  1. MockPathLocator: This class allows tests to have a directory other than the real Drupal codebase used as the "live" directory. This lets us freely alter the "live" directory without altering the codebase that is running the test.
  2. Valid Composer testing fixture: All of our kernel and functional tests that deal with the stage life cycle start with a "live" directory that is created by cloning a single test fixture. This test fixture is a valid Composer project that can be created using composer install by a development script that will be shipped with core. A test that runs composer validate on this fixture is also included to ensure any future updates to the fixture are still valid as far as Composer is concerned.
  3. FixtureManipulator: This class provides the ability to alter the active and staged codebases with real Composer commands. After each change, composer validate is executed to make sure the changes did not have any damaging side effects (a sketch of such a validate check follows this list).
  4. Functional tests: Because Package Manager is an API-only module, it only has one functional browser test. The testing infrastructure described above works with functional browser tests as well, though, as demonstrated by the Automatic Updates module's functional tests. We believe this proves that it is flexible enough to cover the future functional test requirements for when both Automatic Updates and Project Browser are added to Drupal core.
  5. Build tests: The test fixture used by kernel and functional tests is a valid Composer project, but is not a fully bootable Drupal site. For this reason, our build tests create a fully functional Drupal project, using the core templates, which updates its own code via Package Manager.
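For example, the fixture-validity check in item 2 and the post-change check run by FixtureManipulator in item 3 both amount to running composer validate against a directory, roughly like the hedged sketch below (the fixture path and class name are hypothetical, not the module's actual test):

```php
<?php

use PHPUnit\Framework\TestCase;
use Symfony\Component\Process\Process;

/**
 * Sketch of asserting that a Composer fixture stays valid.
 */
class FixtureIsValidComposerProjectTest extends TestCase {

  public function testFixtureValidates(): void {
    // Hypothetical path to the shared fixture described in item 2.
    $fixture_dir = __DIR__ . '/../fixtures/fake_site';

    $process = new Process(['composer', 'validate', '--check-lock', '--no-interaction'], $fixture_dir);
    $process->run();

    $this->assertTrue($process->isSuccessful(), $process->getErrorOutput());
  }

}
```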

How does it work?

At the center of Package Manager is the concept of a stage directory. This is a complete copy of the active Drupal code base, created in a temporary directory that isn't publicly accessible.

Package Manager's interaction with the stage directory happens in 4 phases during the stage life cycle (see the sketch after the list):

  1. Create: A new stage directory is created and the codebase that is managed by Composer is copied into it. Any site-specific assets that aren't managed by Composer, such as settings.php, uploaded files, or SQLite databases, are omitted.
  2. Require: One or more packages are added or updated by Composer in the stage directory.
  3. Apply: The staged codebase is copied back into the live site.
  4. Destroy: The stage directory is deleted.
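In code, driving these four phases looks roughly like the sketch below. The method names follow the contrib module at the time of writing and may change before core inclusion; in practice a module would use its own Stage subclass (see Public API below), and the required package is just an example.

```php
<?php

use Drupal\package_manager\Stage;

/**
 * Rough sketch of the four stage life cycle phases; error handling omitted.
 */
function example_stage_life_cycle(Stage $stage): void {
  // 1. Create: copy the Composer-managed codebase into a new stage directory.
  $stage->create();

  // 2. Require: run Composer against the stage directory only.
  $stage->require(['drupal/token:^1.13']);

  // 3. Apply: copy the staged changes back into the live site.
  $stage->apply();

  // 4. Destroy: delete the stage directory.
  $stage->destroy();
}
```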

External dependencies

  • php-tuf/composer-stager: #3331078: Add php-tuf/composer-stager to core dependencies and governance — for experimental Automatic Updates & Project Browser modules. This library allows Package Manager to run Composer commands in an isolated copy of the codebase. This is important because:
    • Running Composer commands directly on the live site would require the site to be offline for the entire time the Composer command is running. Using Composer Stager allows modules that use it to only put the site in maintenance mode during the copying of the files back to the live site (the "apply" phase).
    • Running Composer commands on a staged copy of the codebase allows us to inspect/analyze/validate the changes that have been made before they are copied into the live site. (For example, Package Manager ensures that any contrib Drupal projects that are changed in the staged codebase are secure and supported; see SupportedReleaseValidator.)
    • Composer Stager is owned and maintained by the Drupal community, but is developed on GitHub, outside of the Drupal namespace, to enable other PHP projects to also use it.
  • colinodell/psr-test-logger: Makes it much easier to test code that writes log entries (mainly unattended updates during cron, which have no other way to report errors). See #3321905: Add colinodell/psr-test-logger to core's dev dependencies (already committed!). (Note that this functionality was previously available in psr/log, which is an existing dependency of Drupal core, but that package removed it in its latest major version, which was adopted by Drupal 10.) A short illustration follows this list.
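A minimal illustration of the pattern this dev dependency enables, assuming the library's TestLogger class; the logged message is made up, and in the real tests the logger is injected into the code under test rather than called directly:

```php
<?php

use ColinODell\PsrTestLogger\TestLogger;

// The TestLogger stands in for the real logger channel handed to the code
// under test; unattended cron updates log failures instead of displaying them.
$logger = new TestLogger();

// Pretend this happened deep inside an unattended update run.
$logger->error('Unattended update of drupal/example failed.');

// The test can then assert on the captured log records.
assert($logger->hasErrorThatContains('update of drupal/example failed'));
```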

Security: d.o's Composer package signing

  • Package Manager ensures that the site requires The Update Framework (TUF) to download packages from drupal.org's Composer endpoint. This means two things:
  • The PHP-TUF Composer integration plugin itself is completely independent of Drupal and, if enabled, always enforces TUF protection for everything in opted-in repositories. That means if the site administrator runs Composer commands at the terminal, they get the same protection! It's important to note that it does NOT protect anything that belongs to a repository that is not opted in to TUF protection. So it will effectively only protect drupal/* packages, at least initially. (With the notable, current exception of the core packages, like drupal/core and drupal/core-composer-scaffold, which are part of the main Packagist repository that doesn't currently have TUF protection.)
  • See #3325040: [Packaging Pipeline] Securely sign packages hosted on Drupal.org using the TUF framework and Rugged for the ongoing work to deploy TUF signing for drupal.org-hosted packages.

Public API

The API of Package Manager can be broken down into these areas:

  1. Stage life cycle events: Modules can subscribe to the events dispatched during the stage life cycle. There are Pre- and Post- events for all the phases of the stage life cycle. Any subscriber to the Pre- events can flag validation errors that will stop the life cycle from proceeding until the errors are resolved (a sketch of such a subscriber follows this list).
  2. Package Manager provides some useful services for analyzing and comparing the state of the live and staged codebases; in particular, PathLocator and ComposerInspector.
  3. The Stage class creates the stage directory and performs the stage life cycle phases, Create, Require, Apply and Destroy as described above. Modules that want to perform other, specialized Composer operations should extend the Stage class.
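For item 1, a validator provided by another module is just an event subscriber. Below is a hedged sketch: the "business hours" rule is made up, and the event class and addError() signature follow the contrib module at the time of writing and may differ in the final core API.

```php
<?php

namespace Drupal\my_module\EventSubscriber;

use Drupal\Core\StringTranslation\StringTranslationTrait;
use Drupal\package_manager\Event\PreApplyEvent;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;

/**
 * Hypothetical validator: blocks the apply phase during business hours.
 */
class BusinessHoursValidator implements EventSubscriberInterface {

  use StringTranslationTrait;

  public static function getSubscribedEvents(): array {
    return [PreApplyEvent::class => 'validate'];
  }

  public function validate(PreApplyEvent $event): void {
    $hour = (int) date('G');
    // Flagging an error on a Pre- event stops the life cycle from proceeding.
    if ($hour >= 9 && $hour < 17) {
      $event->addError([
        $this->t('Staged changes may only be applied outside business hours.'),
      ]);
    }
  }

}
```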

For a more detailed explanation of the API see package_manager.api.php in the merge request.

Dependency evaluation

Policy Questions

  1. ⚠️ core issue ⚠️ #3349368: [policy, no patch] How much of The Update Framework integration is needed for alpha-level review/commit of Package Manager?
  2. ⚠️ core issue ⚠️ #3385644: [policy, no patch] Consider whether to keep Package Manager and Automatic Updates in a separate repo/package than core in order to facilitate releasing updates to the updater

How to help

Please DO NOT push to this branch!! This merge request is still being automatically converted from the 3.0.x version of the Automatic Updates contrib module, where Package Manager is a sub-module. Any changes made directly to this MR will likely be lost in the automatic conversion process.

Feel free to leave feedback on this merge request. If you would like to help address the feedback, please search the 3.0.x issue queue for the contrib module to see if an issue already exists and, if not, create one in that project.


Comments

tedbow created an issue. See original summary.

tedbow’s picture

Category: Bug report » Feature request

Wim Leers made their first commit to this issue’s fork.

Wim Leers’s picture

The 3 failures on March 8:

  • Drupal\Tests\package_manager\Kernel\PathExcluder\NodeModulesExcluderTest → legitimate failure → fixed by my commit just now, the test already passed! 😊
  • Drupal\Tests\ckeditor5\FunctionalJavascript\MediaTest::testViewMode() → random failure

Next: porting all commits that have happened in the contrib module in the past ~60 hours, which includes the reverting of the absurd PCRE limit work-around thanks to #3346628: FixtureManipulator should only use Composer commands, rather than manipulating JSON directly and the removal of the runtime dependency on composer/composer, which is why the diffstat for the last commit is a whopping 33 files changed, 660 insertions(+), 1381 deletions(-)! 🤓🥳

We have only a few alpha blockers left at #3319030: Drupal 10 Core Roadmap for Automatic Updates, and #3321905: Add colinodell/psr-test-logger to core's dev dependencies + #3331078: Add php-tuf/composer-stager to core dependencies and governance — for experimental Automatic Updates & Project Browser modules are sibling core issues that will add the 2 new dependencies that the Package Manager module needs!

Wim Leers’s picture

Ted is working on an issue summary update! 🚀

tedbow’s picture

Issue summary: View changes
Status: Active » Needs review

Adding the actual issue summary; it is no longer a placeholder.

Note: As is mentioned in the summary, we will continue to automatically convert the contrib module to commits on this merge request.

So this issue is the place for reviews, but we will be making issues and commits on the contrib module to address that feedback. Those commits will automatically show back up here through the conversion.

phenaproxima’s picture

Issue summary: View changes
tedbow’s picture

Issue summary: View changes
tedbow’s picture

Issue summary: View changes


tedbow’s picture

catch’s picture

General questions about this bit:

The codebase must be writable by the web server. Because this is intrinsic to the purpose of this module and the two modules that depend on it, this requirement is unlikely to change in the future. Although this will prevent Package Manager from working on some hosting environments, some users may use Package Manager in local or cloud environments where the file system is writable, even if their production environment is not. Project Browser is the most obvious example of this, because installing modules is common when building a site and is often done locally. In Automatic Updates, updating manually via the UI (rather than automatically during cron) may be done by some users in development environments, which is beneficial even if Automatic Updates cannot be used directly in the production environment.

With automatic updates, it's my understanding that unattended updates will be done on cron - this means potentially a different user to the webserver running cron with access to the filesystem, and webserver write access not being entirely required for that situation.

Can automatic updates be configured like that - unattended updates via non-web cron only?

If so, would it be feasible at some point to have queued installs and updates - i.e. you select what you want, it goes into a queue, cron runs the queue (this might need to be something like key/value expirable instead of a real queue so we could show pending status), package_manager installs your stuff, it shows up later. It would require a frequent server-side cron run but pretty sure cpanel and similar can do that. This might make things available to a slightly wider number of users although a fair bit of UX to overcome.

tedbow’s picture

#13 @catch
Right now that is not possible, but I have created a contrib issue for this functionality: #3351895: Add Drush command to allow running cron updates via console and by a separate user, for defense-in-depth. It should not be that difficult. I think we could do that via a Symfony Console command. This command would need to bootstrap Drupal, as any module can provide event subscribers that validate operations as described in the summary. I have not kept up with the commands in core/lib/Drupal/Core/Command, but it looks like bootstrapping is done in \Drupal\Core\Command\ServerCommand::boot, so maybe this would not be something new for core.

I think #3351895 would not actually need any changes to Package Manager and only Automatic Updates but I guess we will see.

If so, would it be feasible at some point to have queued installs and updates - i.e. you select what you want, it goes into a queue, cron runs the queue

Yes, that seems like it would be possible, though of course Package Manager does not have such a queue now; queuing like this could also be done by Automatic Updates itself.

Additionally, we could even do the staging of the install/update into the stage directory at the time the user makes the selection. The cron operation could just do the apply() phase. This would have the advantage of being able to surface any problems that Package Manager or other modules check for and let the user know if there are problems that would stop the operation. For example, the user might want to install Module X, but because of complex Composer requirements that would actually force Module Y to update to an insecure version (Package Manager prevents this). The user would see this right away and not have to wait until the cron job runs to find out.

This could also be done in either Automatic Updates or Project Browser separately, but eventually it might be better if some of this functionality lived in Package Manager. I would think this functionality would be more important in Automatic Updates than in Project Browser, because it would allow more sites to select security updates.

tedbow’s picture

Assigned: tedbow » Unassigned
needs-review-queue-bot’s picture

Status: Needs review » Needs work

The Needs Review Queue Bot tested this issue. It no longer applies to Drupal core. Therefore, this issue status is now "Needs work".

This does not mean that the patch needs to be re-rolled or the MR rebased. Read the Issue Summary, the issue tags and the latest discussion here to determine what needs to be done.

Consult the Drupal Contributor Guide to find step-by-step guides for working with issues.

Wim Leers’s picture

#13:

This might make things available to a slightly wider number of users although a fair bit of UX to overcome.

Absolutely!

Is this a nice-to-have or a must-have?

Ideally we'd have (approximate) numbers for this. We probably can't get that. But maybe we have some experience with it? Do you think it's more like 1, 10 or 50%?


#14:

Additionally we could even do the staging of the install/update into the stage directory at the time the user does the selection.

I was first gonna disagree, but no, this is totally right, because the creation of the stage does not require the ability to write to the codebase, only to the filesystem (similar to just uploading files). That'd definitely make the UX concerns @catch raised less severe, because feedback would be happening together with user actions, instead of at a later time.

I would think this functionality would be more important in Automatic Updates than in Project Browser because this would allow more site to select security updates.

IMHO what @catch described would only provide an acceptable UX for Automatic Updates, not for Project Browser. Project Browser IMHO requires the ability to see the results of what you're doing at the time you're doing it, otherwise it'd be a nightmarish UX.

dww’s picture

The requirement that the web server can write its own files is quite a bummer. We spent considerable effort ~15 years ago in Update Manager to let it work on sites that the web server can write to, and on ones that are more securely configured. For all the effort we're pouring into making a slick, new, "more secure" version of all this plumbing -- to have to tell everyone "but open yourself up to a bunch of wider escalation vulnerabilities by making sure your webserver can write to its own source code" seems like a major step backwards. 😢

I guess we don't really know the actual numbers of how many sites are more securely configured or not. Certainly the UX is more slick and easy if everything is writable. And if folks are really security conscious, they're probably not going to use any of this. So, to help with the long tail of sites not upgrading, most of them are probably on server-can-write configurations and it doesn't really matter. But it's unfortunate we can't play similar tricks in Package Manager that we are in Update Manager to have a "sudo" step if the server can't directly write to its own files...

ressa’s picture

This is a bit of an unsettling sentence:

And if folks are really security conscious, they're probably not going to use any of this.

That makes it sound to me like Automatic Updates will be (more or less) just a toy? In my mind, using Automatic Updates should offer the highest level of security, and not require an insecure set up.

If it does ... well then it kind of defeats the whole purpose of installing it in the first place, as I see it. Or maybe you can't have one (Automatic Updates) without the other (insecure set up)?

tedbow’s picture

Issue summary: View changes

Automatic Updates use case with protected file system

Re #16: to handle the case where the file system is not writeable by the web server, we are working on #3351895: Add Drush command to allow running cron updates via console and by a separate user, for defense-in-depth. It would be for the Automatic Updates cron updates use case, not direct operations through the UI as provided by Project Browser. Of course this would require setting up an outside cron job.

I guess we don't really know the actual numbers of how many sites are more securely configured or not.

This is hard to tell, and of course our numbers for which versions of Drupal people are running depend on those sites using the Update module, but I looked at reported usage recently and found the following.

Here are numbers for sites running secure and insecure versions of Drupal:

  • Secure 9.5: 70,489
  • Insecure 9.5: 84,904
  • Secure 10.0: 9,511
  • Insecure 10.0: 9,797

So in both cases more sites are running insecure than secure versions of Drupal core.
Of those sites running insecure versions, some probably aren't configured to have the file system protected anyway, so having cron Automatic Updates would probably be a security improvement.

For the sites that are running insecure versions and where the file system is protected, it may be a choice of whether those admins want to make the file system writable as a way to keep core up to date. I talked with a few agencies that just cannot afford to keep the sites for their many clients up to date and would be willing to configure their hosting to be compatible with Package Manager, because they think keeping the sites on secure versions is more important. But my discussions were before we considered #3351895, so in that case the file system could still be protected but an outside cron job would have to be set up.

With @catch's idea in #13, it might also be possible to provide a UI to select updates that would then be installed via cron, with a specific command as suggested in #3351895. The lag between selecting the updates and having them installed could be minimal, as the server cron job that runs the updates could be configured to run very frequently; it would not do anything if nothing had been set to update.

Project Browser use case

I know that Project Browser has been proposed at least partially as a local development tool, so in that case the writable file system is probably not a problem.

But of course people will want to run it on production, and I don't think it is explicitly documented that you should not do that. See the related discussion #3352913: Should maintenance mode be suggested or enforced?

So it would run into the same problems. I am not sure the idea to queue installs would work as well for Project Browser as people probably want immediate feedback.

Providing credentials through the form

re #18

But it's unfortunate we can't play similar tricks in Package Manger that we are in Update Manager to have "sudo" step if the server can't directly write to its own files...

I am sure many people aren't familiar with how this works in the Update module, but if you want to know more, see sites/default/default.settings.php and search for "Authorized file system operations:", or if you want to delve into core, see `core/modules/update/update.authorize.inc`, core/authorize.php, and \Drupal\Core\FileTransfer\Form\FileTransferAuthorizeForm. But we don't need to go into the details here.

Basically a user is given a form where they enter either SFTP or SSH credentials to transfer the files of the update (or install, I think).

I don't think it is impossible that we could do something like this, although I don't think the API provided by \Drupal\Core\FileTransfer\FileTransfer would work for us. We might be able to reuse some of the current code.

But before we spend too much time deciding whether we could do this technically, I think we should figure out if we should.
I think that is a product-level decision:

Would it be preferable that if the web server can't write to the codebase that the user should be prompted to enter in server credentials that would allow this?

I don't think we should assume that because this worked well for Drupal 15 years ago we want to do it again. Unfortunately I don't think there is any way for us to know how many sites are using this part of the Update module, because many sites are probably just using the Update module to get the list of available updates.

The current system in the Update module is not Composer-aware, so it would break many sites if it installed or updated modules that needed new or updated Composer dependencies. That is one reason to think it is probably not in wide usage for Drupal 8 and above. I am not sure if we can get better numbers on this; I am guessing just anecdotal ones.

To be clear, we could probably add the functionality to prompt users for credentials as an addition later, though I am not sure we would ever want to do this for cron Automatic Updates. I think the current system in the Update module is made to use the credentials for the current operation the user is attempting, not to keep the credentials around forever, as would be required for cron updates.

dww’s picture

Thanks for the detailed feedback, @tedbow! A few quick thoughts for now:

  1. Definitely do NOT want Drupal to ever save such credentials for later use. That would 💯 defeat the purpose of defense-in-depth here. 😂
  2. I'm not necessarily proposing we need to recreate the authorize.php experience, or re-use any of that code. I'd be shocked if the "API" (such as it is) that we came up with 15 years ago would actually work for these new use cases.
  3. If the run-a-command-via-cron-as-the-"super"-user approach is easy enough to include, that'd be great. For sites that care enough about security to have the files owned by a separate user, they can probably figure out how to RTFM enough to set up such a cron job. We probably don't need a UI-based version of this at all anymore. But if it's relatively easy to figure out how such a UI would work and fit in, and build all the plumbing (and counter-tops) now such that we could bolt such a sink into the house at a later time, great.
  4. I'm not totally convinced that this is a no-go for Project Browser. The UI will already not be "instantaneous" -- we've gotta download things from the Internet, after all. There's already going to be a "waiting while we download (and have composer fetch all the dependencies) step. If "waiting for the download to start" takes 0-60 seconds (until the next cron job fires), that's not going to be the end of the world. I think it'd be worth building such plumbing into the PB workflow, too, and not immediately rule-out defense-in-depth with the explanation that "users will expect immediate results"... It's already not going to be immediate, the UI will have to handle that gracefully...

Gotta run, more later (I hope).

Thanks again!
-Derek

effulgentsia’s picture

Ideally we'd have (approximate) numbers for this. We probably can't get that. But maybe we have some experience with it? Do you think it's more like 1, 10 or 50%?

I don't have data to back this up, but here's my hypothesis. From small/low-traffic sites to large/high-traffic sites, I think we have the following hosting categories:

  • Generic shared hosting
  • Specialty shared hosting
  • Single server VM, VPS, or dedicated hosting
  • Multi-server

I think the first category is the largest in terms of number of sites, and I think the vast majority of sites in this category already have a PHP-writable codebase, because the shared hosting provider gives them a single user account that's the same for ssh, sftp, and php.

For the second category, an example of specialty shared hosting might be a university with an IT department managing a pool of servers, but where each department or team can get its own website. Such an IT department might be savvy enough to provision each internal website owner with 2 user accounts: one for ssh/sftp and a different one for php.

For people running on VMs (e.g., Digital Ocean droplets), VPS, or dedicated, Drupal's docs should recommend that they set things up so that the codebase isn't writable by the web server. Even if this is a smallish percentage of total Drupal sites by number, it's an important audience, and it would be good to not steer this audience towards the less secure setup of having a webserver-writable codebase. I think this is the audience we should be thinking about for #13.

For high availability or high traffic sites that require balancing load across multiple web servers, the automatic updates module proposed for Drupal core can't be used on its own. The site would need its own deployment process to make sure the code on all web nodes is kept in sync. However, the site could setup a staging server that runs automatic updates and then triggers the site's deploy-to-prod process. Depending on the other security checks involved in that process, it might or might not be important for that staging server to have its codebase be non-writable by the web server.

dww’s picture

@effulgentsia: That's a helpful way to categorize things in #22, thanks.

FWIW, I've been using DreamHost as my "Generic shared hosting" solution for many years. It's definitely a big player in that space. One of the key features that drew me to it in the first place was that it does let you create different users for different things, and although it takes a little extra work, you can (and I do) setup sites there so that the files are owned by a different user than httpd runs as. I'm sure I'm in the minority on that approach, but pointing out that even some of the "Generic shared hosting" solutions can still be targets for making this work (in some way) on more secure-in-depth sites.

tedbow’s picture

@dww re #23: So in your case on DreamHost, would the hosting also allow setting up *nix-style cron jobs, via cPanel or otherwise, that would allow updating via the server cron job triggering an Automatic Updates console command (once it is built)?

dww’s picture

Yup. You can ssh into a host and use crontab directly. I never mess with cpanel. So yes, in principle, this would all work there with the separate cron job…

catch’s picture

Is this a nice-to-have or a must-have?

#3351895: Add Drush command to allow running cron updates via console and by a separate user, for defense-in-depth might be must-have (although I'm not sure why running regular Drupal cron via drush and switching off automated cron isn't enough to cover that?). I don't think any of the rest of this is. However it might be necessary to get x% extra sites to adopt automatic updates.

IMHO what @catch described would only provide an acceptable UX for Automatic Updates, not for Project Browser. Project Browser IMHO requires the ability to see the results of what you're doing at the time you're doing it, otherwise it'd be a nightmarish UX.

If I install something via the UI via Ubuntu's package manager, or via google play on my phone, I usually hit 'install', start doing something else, and then somewhere between 15 seconds to 3 minutes later get a notification that the app was installed. If I'm installing over 4g sometimes it has to give up once or twice and restart and takes a lot longer. You do get the 'percentage complete' stuff but as we all know that's a lie anyway. The fact that it happens in the background is a major feature in fact. This is also the case for manually triggered updates on those environments too. So it's a pretty standard pattern that installs and updates get queued, and eventually they get done, and then you can check back and/or get a notification that it happened. It's also IMO a better UX than 'tough, you can't use this unless you change your file permissions' and it's also a much better UX than when installs and updates used to block all other operations back in the day.

@effulgentsia:

I think this is the audience we should be thinking about for #13.

Yes this is also broadly what I have in mind - people who aren't on call to update their website on Wednesday afternoons, but can follow 'advanced' instructions step by step to get their server/Drupal configured decently. Or just people who might or might not use automatic updates, won't mind configuring it to work on their system (like adding a cron job and adding something to $settings or similar), but won't want to specifically configure their entire system to work with automatic updates.

Wim Leers’s picture

#22: insightful analysis, thanks @effulgentsia! 😊

#26:

although I'm not sure why running regular Drupal cron via drush and switching off automated cron isn't enough to cover that?)

Agreed. But I suspect @dww's reasoning is that drush should not be required to be installed on the web server, Drupal core alone should be sufficient? @dww, can you elaborate? (I'd really rather not be guessing!)

The fact that it happens in the background is a major feature in fact. This is also the case for manually triggered updates on those environments too.

Fair! But a key difference is that Drupal does not have UI concepts in place for keeping the user informed of these kinds of things… there's never been a concept of "background tasks" and keeping the user informed of that. Although one could argue that that is what cron does, and then a Drupal status message would be the equivalent of a notification? (Easily missed/ignored though 😅)

Is that what you had in mind?

catch’s picture

We might want to split this bit off to its own issue, but keeping going here just for now given it's currently pretty linear.

Although one could argue that that is what cron does, and then a Drupal status message would be the equivalent of a notification? (Easily missed/ignored though 😅)

Yes something like that. If we don't use queue itself, but instead key/value or similar, then we could have a constant message telling you the update/install is pending (maybe only on the project browser/automatic updates UI) and an extra message telling you it's done. We could keep something in session referencing the pending operation(s) then check status based on that, so that the logic only needs to run for the user who triggered it + maybe for everyone on the project_browser/automatic_updates UI pages in the case of multi-admin installs.

If you miss/ignore the notification, does package_manager (or project browser/automatic updates) maintain logs for what it's done? It seems useful especially for automatic updates to be able to easily see that x module was updated from x.y.y to x.y.z release (on the site rather than an e-mail), and then that could also show installs. Since updates might trigger installs too when dependencies are added, any such log probably needs to show both anyway.

dww’s picture

I don’t exactly care about drush cron vs a dedicated script, and I’ve never raised any such concern.

I’m raising the flags that designing Package Manager to not handle the non-writable file system case is a mistake (that we can still correct) and that saying “but even if we did, it’ll never work for Project Browser because users expect immediate feedback” is misguided since nothing about Project Browser will actually be “immediate”, anyway. I’m advocating (like I have for many years) that all these update-your-site tools should work when a site is configured “correctly” 😉 to not let httpd write to its own files.

dww’s picture


I posed the question about the Project Browser UI having a "waiting for download to start" phase, and got confirmation that:

  1. The PB UI already has a progress spinner / waiting for stage X.
  2. Adding another phase would be a small incremental change to the existing UI, not a whole new kind of problem for them to solve.

So I believe there's no reason to claim PB can't work with defense-in-depth, too. It's already not instantaneous, and another phase isn't a game changer for their UI. But supporting the non-writable filesystem case would be a game changer for that (subset?) of users who configure their sites "correctly". 😅

Slack transcript attached (with permission). https://drupal.slack.com/archives/C01UHB4QG12/p1681848852629629

Thanks,
-Derek

Version: 10.1.x-dev » 11.x-dev

Drupal core is moving towards using a “main” branch. As an interim step, a new 11.x branch has been opened, as Drupal.org infrastructure cannot currently fully support a branch named main. New developments and disruptive changes should now be targeted for the 11.x branch, which currently accepts only minor-version allowed changes. For more information, see the Drupal core minor version schedule and the Allowed changes during the Drupal core release cycle.

tedbow’s picture

Status: Needs work » Needs review
Issue tags: +Needs issue summary update
  1. @catch re #26

    #3351895: Add Drush command to allow running cron updates via console and by a separate user, for defense-in-depth might be must-have (although I'm not sure why running regular Drupal cron via drush and switching off automated cron isn't enough to cover that?).

    We have since removed the Drush command (since core can't rely on Drush) and did #3360485: Add Symfony Console command to allow running cron updates via console and by a separate user, for defense-in-depth.

    This allows setting up a server crontab to run this new command as a more privileged user, keeping the codebase write-protected from the user that the web server is running as.

  2. Setting this issue to Needs Review. The currently failing test is GenericTestExistsTest, because we don't have this test for package_manager. I need to figure out how to add this file to the merge request without having it be in the contrib module, as it would fail in 10.1.x and 10.0.x tests.
needs-review-queue-bot’s picture

Status: Needs review » Needs work

The Needs Review Queue Bot tested this issue. It no longer applies to Drupal core. Therefore, this issue status is now "Needs work".

This does not mean that the patch needs to be re-rolled or the MR rebased. Read the Issue Summary, the issue tags and the latest discussion here to determine what needs to be done.

Consult the Drupal Contributor Guide to find step-by-step guides for working with issues.

tedbow’s picture

Status: Needs work » Needs review

Tests passing🎉

GitLab tests are not passing because our tests need rsync, so we will need to update .gitlab-ci/pipeline.yml to add rsync.

But I think this is ready for review, as the DrupalCI testing shows the tests pass when rsync is available.

tedbow’s picture

Issue summary: View changes
Issue tags: -Needs issue summary update

Added info about the Symfony console command.

needs-review-queue-bot’s picture

Status: Needs review » Needs work

The Needs Review Queue Bot tested this issue. It no longer applies to Drupal core. Therefore, this issue status is now "Needs work".

This does not mean that the patch needs to be re-rolled or the MR rebased. Read the Issue Summary, the issue tags and the latest discussion here to determine what needs to be done.

Consult the Drupal Contributor Guide to find step-by-step guides for working with issues.

tedbow’s picture

The current merge request is not up-to-date with the work in the contrib module.

There is an automated script to convert the module, but it is still a fair amount of work. For this reason, and because we aren't actually getting reviews (nobody but me has commented since April), I am going to stop running conversions.

If you want to review the code, I would suggest reviewing the contrib module, which the MR here has always been an automated conversion of: https://www.drupal.org/project/automatic_updates

When core reviewers have time, especially the product, release, and framework managers, if you would like to review the module here instead of the contrib module, please contact me or comment here and I can run the conversion again.

I am tempted to postpone but I won't for now

tedbow’s picture

Issue summary: View changes
lauriii’s picture

Issue summary: View changes
catch’s picture

MockPathLocator: This class allows tests to have a directory other than the real Drupal codebase used as the "live" directory. This lets us freely alter the "live" directory without altering the codebase that is running the test.

Without reviewing all the test coverage in detail, the first question that comes to mind here is 'why not use build tests for this?', see for example ComponentsIsolatedBuildTest - was this attempted? Is there an issue I can look at or documentation for why not? I see a couple of build tests, but they aren't using MockPathLocator which seems to be all for kernel tests.

phenaproxima’s picture

Re #40: Package Manager has unique needs -- we need to move files around and simulate a "fake" Drupal site, but we also need to be able to test the APIs directly. Build tests are too "far" from the APIs to be useful for this; kernel tests do not, by default, deal with real files in a filesystem.

So, rather than try to get a build test to boot up a kernel in a fake site that isn't bootable, we ended up adapting kernel tests so that they can use Package Manager's APIs with a small simulacrum of a Drupal site based on real files. That's what almost everything in PackageManagerKernelTestBase and package_manager_bypass are there to facilitate. It's simply an easier lift than doing it the other way around.

tedbow’s picture

@catch, to also emphasize what @phenaproxima mentioned, this is used in all of our kernel tests (also Automatic Updates').
We have 375 kernel test cases, so this allows us to write the kernel tests to be simpler and more isolated than build tests would be; even with GitLab being faster, I think we would still have to worry about how long 375 build tests would take.

If you wanted to look further \Drupal\Tests\package_manager\Kernel\PackageManagerKernelTestBase::createTestProject is where we create the test project that mock path locator will be pointed to.

In production we use \Drupal\package_manager\PathLocator as a service that tells us where the vendor directory, project root, web root, and staging root are. We then swap this service out in kernel tests for the MockPathLocator.

We do of course also have build tests where we don't do any of this mocking. Since Package Manager doesn't have a UI, we test via a simple Drupal module that calls our APIs. We also have build tests in Automatic Updates that test our UI, cron, automated cron, and the console command, also without swapping any services.

catch’s picture

So, rather than try to get a build test to boot up a kernel in a fake site that isn't bootable, we ended up adapting kernel tests so that they can use Package Manager's APIs with a small simulacrum of a Drupal site based on real files. That's what almost everything in PackageManagerKernelTestBase and package_manager_bypass are there to facilitate. It's simply an easier lift than doing it the other way around.

This sounds like a good explanation - I'll try to take a closer look soon to get my head around it a bit more.

tedbow’s picture

Status: Needs work » Needs review
Issue tags: +no-needs-review-bot

I am changing this to needs review. The bot changed it to needs work in #36.

The tests are not passing because we had been relying on DrupalCI testing, as we weren't sure when in the process core would switch to GitLab.

I think it is still reviewable; all the tests pass pre-conversion in the contrib module and were passing here before the switch to GitLab only.

Since the switch was made, we are working on #3394413: Enable GitLab CI, and use it to build out a core code base with Automatic Updates and Package Manager as core modules

catch’s picture

It's very hard to tell just from the issue summary what the remaining work is here.

#3319030: Drupal 10 Core Roadmap for Automatic Updates is now linked (it wasn't previously), but specific issues which will result in changes to package manager prior to an alpha commit, or core bugs that are blocking PM should ideally be tracked directly in this issue - even after reading the roadmap issue, it's not clear to me which ones block PM, which ones block AU, which core bugs are just bugs that affect the modules, or which will unblock changes AU to PM that are still necessary etc.

tedbow’s picture

Issue summary: View changes

Re #45, I added the links to the core dependency and policy questions. It doesn't seem like #3385644: [policy, no patch] Consider whether to keep Package Manager and Automatic Updates in a separate repo/package than core in order to facilitate releasing updates to the updater is a blocker, but I added it because it is still open and I don't think this is something we would want to do later. So I think it should be closed if there is consensus in core governance to do this.

smustgrave’s picture

So this seems to have stalled; is there any help needed? I don't have the role to make any of the calls though.

smustgrave’s picture

@catch wonder if you have an answer or know who to ask regarding #46?

catch’s picture

smustgrave’s picture

So does that mean this should continue? Definitely needs rebasing.

effulgentsia’s picture

I think the best next step to move this issue forward is to get reviews of the code in https://github.com/php-tuf/composer-stager. @TravisCarden just recently (2 weeks ago) released 2.0.0-beta4. It's only in beta status because it hasn't been reviewed much by people other than those working on Package Manager. As far as we know though, it's feature complete and can be marked RC or stable once it's adequately reviewed. The most noteworthy recent change is that per #3416542: [policy] Require rsync for automatic updates in Drupal core and punt other syncers to contrib we removed the PHP filesyncer, so it now only works if people have rsync installed. This makes the package easier to review and easier to maintain. If there's a significant population who's on systems that don't already have rsync installed and can't install it, then a separate GitHub repo/package could be created (if someone were willing to maintain it) containing alternate file syncers (e.g., one that uses the Windows robocopy command for people on Windows), but those would not be "part of core" (i.e., maintained by Drupal core committers). The Drupal issue for discussing stuff related to Composer Stager is #3331078: Add php-tuf/composer-stager to core dependencies and governance — for experimental Automatic Updates & Project Browser modules. Issues and PRs could also be filed in the GitHub repo.

Beyond the above, @tedbow: it would be good to get an updated MR of Package Manager into this issue.

tedbow’s picture

Assigned: Unassigned » tedbow

Ok I will update with the latest changes from the contrib module

tedbow’s picture

Assigned: tedbow » Unassigned

Now that I see #3432860: Update to Symfony 7.0 I realize that Composer Stager would have to be updated to Symfony 7. I made an issue for this https://github.com/php-tuf/composer-stager/issues/350

I assume there is no way Automatic Updates is going to get into Drupal 10.x at this point. Is that correct?

I am not sure how long the Composer Stager work will take. If it is going to take a while, would it be useful for me to push up a version of this MR with the Package Manager module without Composer Stager as a dependency? All the tests would fail, but it would still be the latest code and could be reviewed.

effulgentsia’s picture

Status: Needs review » Needs work

https://github.com/php-tuf/composer-stager/issues/350 got completed so now Composer Stager is compatible with both Symfony 6 and 7, so I think the MR in this issue should include it so that tests run properly.

tedbow’s picture

Status: Needs work » Needs review

I have pushed the latest changes. The build tests will still fail in this version. I will try to figure that out. It is a bit difficult to get them passing in both the contrib version and the core version, as the GitLab templates are a bit different, but it should be doable.

UPDATE: Build tests should be passing now

catch’s picture

Status: Needs review » Needs work

This needs another rebase - conflicts in composer/Metapackage/CoreRecommended/composer.json

Did a not at all comprehensive first pass review to try to get things moving here.