Postponed: Waiting on #1612910: [policy, no patch] Switch to Semantic Versioning for Drupal contrib extensions (modules, themes, etc).

Problem/Motivation

Last week, at Drupal Dev Days Montpellier, we (@klausi, @fago, @derhasi, @fubhy, @bojanz, @mariancalinro, and @Xano) had a long and fruitful discussion about managing Composer dependencies for modules in Drupal 8. We learned a lot, but also discovered that allowing modules to be placed into applications the 'old' way (putting a package in a modules folder) will inevitably lead to critical failures (fatal errors if required classes do not exist) if those modules require Composer packages.

As Composer Manager is only a temporary solution (I will yield the floor to @bojanz here, as he can explain its limitations much better than I can), we realized that Composer-driven development is possibly the only solution that prevents people's sites from breaking badly because of missing dependencies. This means that Drupal applications can no longer be built by putting contributed modules in a modules folder, but that the application requires one primary Composer file that manages all dependencies, including Drupal core and contributed modules.

There are several reasons why we should allow modules to specify dependencies through Composer:

  • Getting off the island (better integration with other PHP systems)
  • Reducing maintainers' workloads by allowing them to reuse much more existing code than they have been able to before. This reduces development time, increases quality, and could go a long way toward preventing burnout.

This proposal leaves the functionality to enable and uninstall modules through the module handler (and therefore the administrative UI) intact. It only covers the approach we use to assemble the code base.

Proposed resolution

Remaining tasks

To be determined.

User interface changes

API changes

Modules can no longer simply be downloaded and placed in a modules directory, but have to be added to a Drupal application through Composer.


Comments

webflo’s picture

Issue tags: +Composer
lauriii’s picture

I do agree there are changes that need to be addressed to make Drupal 8 support Composer dependencies in modules properly. Before moving forward we need to make sure that there is no solution with less regression for APIs and UI. Maybe we could take something from Composer Manager so that a module couldn't be installed if it doesn't meet its dependencies.

The problem we have on this issue is that almost anything we do is going to break the release cycle rules because we are already in beta. However, what we have at the moment will cause people to break their sites, which on the other hand isn't very nice and has to be fixed somehow. There are already large modules that have committed to working this way.

We should also discuss how this should work with themes. I think themes should also be able to have dependencies on external libraries.

webflo’s picture

Contao CMS uses Composer as the primary way to download and update extensions. Maybe we could explore a similar solution? https://c-c-a.org/ueber-composer

David_Rothstein’s picture

Status: Active » Postponed (maintainer needs more info)

We learned a lot, but also discovered that allowing modules to be placed into applications the 'old' way (putting a package in a modules folder) will inevitably lead to critical failures if those modules require Composer packages.

This issue does not explain what those critical failures are. Can you elaborate?

If a module has external dependencies, it should implement hook_requirements() to prevent the module from being installed until those external dependencies are in place. (This is independent of the method that the site uses to add the dependencies to the codebase, whether it's Composer or any other method.) So I am having trouble seeing what the bug here would be...

Xano’s picture

Issue summary: View changes
Status: Postponed (maintainer needs more info) » Active

This issue does not explain what those critical failures are. Can you elaborate?

Missing dependencies, which causes fatal errors.

If a module has external dependencies, it should implement hook_requirements() to prevent the module from being installed until those external dependencies are in place. (This is independent of the method that the site uses to add the dependencies to the codebase, whether it's Composer or any other method.) So I am having trouble seeing what the bug here would be...

We cannot require developers to implement hook_requirements() for every single module of theirs that has Composer dependencies. This would require a lot of custom code to analyze the Composer status of the project. Simply checking whether certain classes exist is not enough to make sure the correct versions of dependencies are all installed, or to make sure that the dependencies' dependencies are installed. Writing such custom code would essentially mean we bypass Composer in order to do what Composer already does.

dawehner’s picture

Issue tags: -Release blocker

It's clear that Drupal 8 modules will rely on Composer dependencies far more than contrib modules ever relied on PHP libraries before; this is a great thing we should encourage.

Missing dependencies, which causes fatal errors.

At the same time, I totally agree with David that this is not a new problem: modules have always had to deal with missing dependencies, and hook_requirements() was invented for exactly that problem. That also raises the question of why this should be a release blocker; I would 100% agree that this is a major issue.

Simply checking whether certain classes exist is not enough to make sure the correct versions of dependencies are all installed, or to make sure that the dependencies' dependencies are installed.

Well, it at least fixes a big portion of the problem. I think in D7 contrib modules also did not check the actual library version, just its existence.

I'm curious whether core can provide helpers so that contrib modules have an easier time validating the version of an installed dependency.

Removing the release blocker tag as it's part of the criticality anyway.

webchick’s picture

Modules can no longer simply be downloaded and placed in a modules directory, but have to be added to a Drupal application through Composer.

I object to this approximately 85,000%.

I'm glad that Composer is a fantastic tool for developers who are comfortable using it to manage dependencies. But Drupal is not a tool (solely) for developers. In fact the entire point of using Drupal over a framework like Symfony or Silex or Laravel is that it allows non-developers to do amazing things. We cannot break this fundamental strength of Drupal by requiring non-developers to figure out Composer, esp. on e.g. Windows machines.

hook_requirements() it is. Or, fix the d.o packager so that it recognizes composer stuff and pulls that in. Or whatever. But this requirement should not be shoved onto our end-users (site builders), most of whom are not developers.

webchick’s picture

Priority: Critical » Major

And since there is a workaround (use hook_requirements), I don't see anything here that would meet the definition of critical.

bojanz’s picture

So, here's my view of the problem (as the author of composer_manager's current d8 architecture and someone adopting Composer for Commerce 2.x).
Sorry for the wall of text. I promise it's worth it :)

Drupal's ecosystem grew up without any kind of dependency resolution. You always had two options: either manually download the modules that you need, or use Drush to do it. However, Drush still won't resolve your dependencies; you need to specify precisely which modules to get. The result of this technical limitation is that dependencies have always been frowned upon. A module with too many dependencies is a bad module. Duplication is encouraged to an extent. This is how we end up with an ecosystem of mega-modules (such as Commerce, Ubercart, Media, Panels, etc.), simply because our downloader sucks (or we aren't even using it).
It also made it hard to use external libraries. Drupal.org still has modules and profiles that commit libraries directly into the repository, which is against the rules.
But if you follow the rules, that means telling your users to perform a manual download step before being able to use the module.

Then Composer happened, and Drupal core embraced it. Since then, it has been recognized (by the community and the Drush maintainers) as the tool that will replace Drush dl and Drush make for us, leaving us with the same workflow, but giving us dependency resolution. Of course, that won't have a transformative effect on our ecosystem until we actually start taking advantage of it. And starting with Commerce, Rules, and other modules, we will. The result of that is more dependencies (on other modules, packages, etc).

Of course, unlike Drush, Composer is not optional. You can no longer download a PHP library manually, place it somewhere, and make it work. It needs to be done through Composer. So, tomorrow when you download Commerce from drupal.org, you realize that your archive is useless, because it requires additional libraries you can't get manually. Installing the libraries (using Composer, or Composer Manager, or Embedded Composer, or whatever) requires going down to the command line and typing a command. No matter what you do, and how you reinvent the wheel, the end user needs to type a command (or click a GUI button that will result in the same process).

There is no workaround for this, you can't package a module with its dependencies, because dependency resolution is done on the site level. However, we can fix profile packaging to run composer (and will have to, of course).

So, without any kind of intervention from this issue, here's how our workflow will look, the same workflow all of PHP is using:
1) When you go to the Commerce page on drupal.org, you won't see releases, we'll hide them. Instead you'll see a 'composer require "drupal/commerce"' command listed, which you're supposed to run.
2) If you need to get started quickly, you will download Kickstart, which contains all of the modules and libraries prepackaged. However, you will still need Composer to add additional modules.
3) Downloading modules via update manager is a thing of the past.

And sure, it's fine to want to keep people out of the command line. We build a GUI (a la Acquia Dev Desktop) on top. The crux of the matter is not the command line, it's treating extension downloading as a process that happens outside of Drupal (the web UI), which has been a best practice for years; we've just allowed people to skip it until now.

bojanz’s picture

Also, separate comment because I want to emphasize this, we don't need to change core or drupal.org right now.
But we do need to be aware that a change is happening which has big philosophical implications on Drupal.

And it sounds like we could use a BoF in LA.

klausi’s picture

If a module requires a Composer PHP component then people will have to use Composer to install that component. There is no way around that, because the autoload.php files from that D8 site using the module must be rewritten by Composer.

We expect that there will be quite a few D8 contrib modules that will have dependencies on PHP components (Drupal Commerce being the most prominent example right now), so site builders will have to use Composer sooner or later when they build a D8 site. I repeat: D8 site builders will have to know Composer or a tool built on top of it (Composer Manager, for example) to manage their autoloading.

I think writing hook_requirements() is a waste of time for a module that depends on a PHP component. The composer.json file in that module already specifies its requirements, so we would only repeat what Composer is already doing for us.

I expect the following to happen:
1) We complain that we cannot simply download modules and components and place them in a directory and be done with it.
2) We think about a plan how we could work around that and still make it work.
3) Nobody does the work because we are developers and know how to use Composer and have better things to do.
4) Drupal 8 gets released.
5) Site builders have a short WTF experience when they are forced to use Composer.
6) Site builders get over it and comfortably use Composer.
7) THE END

I'm not sure I agree with the issue summary regarding having no packaged downloads on drupal.org for modules at all. Modules without Composer dependencies (which will be most modules) can still be used fine by just downloading the tarball and placing the extracted folder in the modules directory. We could just disable packaged downloads for modules that have a composer.json file with dependencies committed to git.

Ideally Drupal core should check before enabling a module if it comes with a composer.json file and if the dependencies in it are satisfied. So Drupal core would ship with Composer and invoke it to check itself before enabling modules.
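Even without shelling out to Composer itself, a rough sketch of the presence check could look like this. It uses a hypothetical helper (underscore-prefixed because nothing like it exists in core) and assumes Composer 1's vendor/composer/installed.json layout; it only verifies that required packages are present at all, since honouring version constraints would mean reusing Composer's own semver handling:

function _module_composer_dependencies_satisfied($module_path) {
  // Hypothetical helper, not an existing core API: returns TRUE when every
  // package listed in the module's composer.json "require" section appears
  // in Composer's record of installed packages. Version constraints are
  // deliberately ignored; honouring them would mean reusing Composer's own
  // semver code rather than reimplementing it here.
  $module_file = $module_path . '/composer.json';
  if (!file_exists($module_file)) {
    return TRUE;
  }
  $definition = json_decode(file_get_contents($module_file), TRUE);
  if (empty($definition['require'])) {
    return TRUE;
  }
  // Assumes the Composer 1.x layout of installed.json: a plain JSON array
  // of package definitions.
  $installed_file = DRUPAL_ROOT . '/vendor/composer/installed.json';
  if (!file_exists($installed_file)) {
    return FALSE;
  }
  $installed_names = array();
  foreach (json_decode(file_get_contents($installed_file), TRUE) as $package) {
    $installed_names[] = $package['name'];
  }
  foreach (array_keys($definition['require']) as $name) {
    // Skip platform requirements such as "php" or "ext-*".
    if ($name === 'php' || strpos($name, 'ext-') === 0) {
      continue;
    }
    if (!in_array($name, $installed_names)) {
      return FALSE;
    }
  }
  return TRUE;
}

The module installer could run something like this next to its existing *.info.yml dependency check and keep the checkbox disabled whenever it returns FALSE.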

joelpittet’s picture

Ideally Drupal core should check before enabling a module if it comes with a composer.json file and if the dependencies in it are satisfied. So Drupal core would ship with Composer and invoke it to check itself before enabling modules.

That sounds like a nice DX improvement there.

Mile23’s picture

Then Composer happened, and Drupal core embraced it

Not true. Drupal core halfassedly embraced it, which is why this is a problem.

Drupal modules can be placed exactly where they belong using composer-installers. You just have to declare your dependency on that plugin in the module's composer.json file and set the proper type. https://github.com/composer/installers

Composer-based dependencies (external libraries) can then be declared in the module's composer.json file.
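For illustration, a module's composer.json along those lines might look roughly like this; the package and library names are placeholders, while "drupal-module" is one of the types composer/installers actually understands:

{
  "name": "drupal/example_module",
  "description": "Sketch only: the package and library names are placeholders.",
  "type": "drupal-module",
  "require": {
    "composer/installers": "~1.0",
    "acme/some-library": "~2.0"
  }
}

The site's root composer.json then maps the drupal-module type to the modules directory through composer/installers' installer-paths configuration, which is how the module ends up exactly where Drupal expects it.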

The problem is that Drupal 8 doesn't do the right thing with Composer. For instance, it specifies two different vendor directories depending on how you install it.

The other problem is that so much energy has gone into unnecessary infrastructure like http://packagist.drupal-composer.org/ without putting any energy into doing the easy thing and making Drupal 8 use composer the right way. Want to help? Work on this: #2002304: [META] Improve Drupal's use of Composer and in particular this #2380389: Use a single vendor directory in the root which is held up by the testbot.

Removing the UI for installation is 100% the wrong way to go, especially in order to work around something for which there is more than one easy solution.

I blame maintainers for this problem, not the people trying to work around it by making stuff like drupal-composer.

That said, this issue is completely wrongheaded.

The proper solution is this:

  • If you install a module that has a dependency, you should get a notice in Drupal that the dependency is still unmet and that composer is needed to meet it. This comes from hook_requirements().
  • The user then goes to the command line and types 'composer install'. Because we will have fixed D8's use of composer, this will only add the library in question.
  • If you used drush to DL the module, drush should say something like, "This module has unmet dependencies managed through composer. Do you want to install these dependencies? (y/n)"
  • And you are done.

This is similar to the Libraries workflow, but MUCH EASIER. Less-sophisticated Drupal users will find this much better than figuring out which sites/libraries directory to download into and so forth.

Mile23’s picture

Ideally Drupal core should check before enabling a module if it comes with a composer.json file and if the dependencies in it are satisfied. So Drupal core would ship with Composer and invoke it to check itself before enabling modules.

Nope. Drupal should not care if contrib's dependencies are met. It should only be a CMS.

The module should use hook_requirements() to check if the dependencies exist, just as it works with Libraries right now. Only the module can know the hard requirements for the module, not Drupal.

Here's a prototype:

function hook_requirements($phase) {
  $requirements = array();
  if (!class_exists('\Your\Dependency\Class')) {
    $requirements['my_module'] = array(
      'title' => t('My Module'),
      'value' => t('You need Your\Dependency\Class to use this module. Install it with composer.'),
      // Use REQUIREMENT_ERROR instead to block installation outright.
      'severity' => REQUIREMENT_WARNING,
    );
  }
  return $requirements;
}
klausi’s picture

Drupal is a CMS that allows enabling modules in the UI at runtime. It already checks if a module's dependencies on other modules are met (from the *.info.yml file), so ideally it should also check if a module's dependencies on other PHP components are met (from the module's composer.json file).

Mile23’s picture

@klausi #14: Once Drupal has used Composer to discover what hook_requirements() can already tell it, what will it do? Answer: Nothing.

hook_requirements() can show a message to the user saying something is wrong with the site, and it can also abort installation if that's needed. It's plenty flexible, maintains the UX for less-sophisticated users, and is basically the same workflow as Libraries, except much easier.

What's needed here is to design a set of desired behaviors, so that we can move on them instead of investing in technical debt everywhere but the one place it matters.

That conversation is happening here: #2002304: [META] Improve Drupal's use of Composer

Xano’s picture

I'm glad that Composer is a fantastic tool for developers who are comfortable using it to manage dependencies. But Drupal is not a tool (solely) for developers. In fact the entire point of using Drupal over a framework like Symfony or Silex or Laravel is that it allows non-developers to do amazing things. We cannot break this fundamental strength of Drupal by requiring non-developers to figure out Composer, esp. on e.g. Windows machines.

Which is exactly why one of the conclusions of the DDD Montpellier discussion was to create a user interface to build Composer files. In addition to that, @bojanz explained how distributions and install profiles can leverage Composer to provide great out-of-the-box experiences.

Ideally Drupal core should check before enabling a module if it comes with a composer.json file and if the dependencies in it are satisfied. So Drupal core would ship with Composer and invoke it to check itself before enabling modules.

Regardless of whether we deprecate packaged downloads or not, doing this could prevent WTFs in any case. I do have to note that simply checking Composer dependencies directly will not suffice, as modules may depend on other modules that in turn depend on modules with Composer dependencies. However, as modules without Composer dependencies won't have a Composer file, we'd have to mock such behavior. Forcing the entire dependency chain check to run through Composer would make the code for such a check much simpler, though.

Removing the UI for installation is 100% the wrong way to go, especially in order to work around something for which there is more than one easy solution.

This issue has nothing to do with module installation. It is only and completely about assembling the required code base. Doing this through Composer has the additional (non BC-breaking) change that if you download a module, but do not enable it, its classes can still be used by code from enabled modules. Module installation is only required if the module itself needs application-level integration with Drupal (plugins, routes, services, etc.)

The module should use hook_requirements() to check if the dependencies exist, just as it works with Libraries right now. Only the module can know the hard requirements for the module, not Drupal.

As I already wrote in #5, writing such checks is tedious at best. Simply checking for a class's existence will not be enough to make sure the correct version of a dependency is installed. Writing Drupal- or module-specific code to check these dependencies would mean we duplicate some of Composer's code.

Seeing as this issue has been downgraded from critical, I am adding the DX (Developer Experience) tag.

bojanz’s picture

Good discussion so far.

The crux of the matter is not about Composer at all.
It is about moving from manual to automatic dependency resolution, and moving the "site assembly" outside of the web UI (a long time best practice). The question should not be "Will enforcing best practices for the first time in Drupal history cause site builders to run away?", it should be "How can we make these best practices clear and easy to follow so that we can all share the gains?".
Looking at Reddit and forums, when Drupalgeddon hit, the only devastated users were the ones who didn't use Drush, Git, or care about security. That's the result of optional and almost hidden best practices.

Clarifying:
- Why is Composer Manager just a temporary solution?
You still need to run Composer from the command line. The workflow is
1) Download module manually or via Drush 2) Run Composer to fetch dependencies
If you need to do #2 anyway, then you might as well use it to fetch the actual module as well.

- Why does Composer work on the site, and not the module level?
Because of dependency resolution. At runtime we're all one big happy family, so all other dependencies and shipped code need to be analyzed when resolving dependencies. Otherwise you can easily end up with conflicts, duplicates, and breakage. Drupal has worked around this in Contrib by saying "a library always has one module, and everyone should depend on that module", but it's still just a workaround. From a Composer perspective there is no difference between a Drupal module and a PHP library; they're both packages.

- Why can't Update Manager just run Composer?
The same reason it can't update Drupal core: permissions.
Update Manager is conceptually wrong: if you want to assemble a set of packages and automatically update them, you need to do it outside of the web UI and the web server user. If we provided a GUI for selecting a Drupal core, adding modules, and updating it all with the click of a button, we'd go further than Update Manager ever went.

EDIT: And when I say GUI, I don't mean a web app. Because permissions. I mean a windows/linux/mac app.

joshtaylor’s picture

Apache/NGINX both use www-data as their user/group, and best practices usually have you disable write permissions, which is why a GUI option isn't usually the best option...

Unless they use Windows, in which case they shouldn't have those file permission issues, so updating Composer dependencies via the WebUI shouldn't be a problem for Windows.

I believe cPanel runs (by default?) using the owner of the script, not www-data.

One way this can be fixed is by using (S)FTP to upload files, which is what WordPress (or a plugin, I can't remember exactly) does to upload modules; you just hope that dependencies can be resolved before having to run something like this, as you could have hundreds or thousands of files.

fago’s picture

EDIT: And when I say GUI, I don't mean a web app. Because permissions. I mean a windows/linux/mac app.

I'm not sure about this. When you use something like acquia dev desktop or MAMP, you'll have the execution environment for composer. And locally, you should not care about file permissions - so Drupal can encourage being installed with write permissions for being able to run composer - from the Drupal gui. Once you are done, you can upload the new code base. That perfectly fits the workflow we promote with CMI already.

The module should use hook_requirements() to check if the dependencies exist, just as it works with Libraries right now.

I think the main reason why this isn't a proper solution is that it's too complicated. It's critical that dependencies are easily exposed and consumed without hassle - only then can we effectively bring a share-and-reuse mantra to contrib.

So if sharing and re-using becomes the default, modules will end up with lots of dependencies. For example, I think modules like Rules could have 10 more dependencies. No site builder is going to download that manually. And no one wants to have hundreds of hook_requirements() checks going on.

The only alternative to composer that I could envision is improving our module system to handle API modules and invent an automatic downloader that takes care of the many dependencies. But then, we'd not only re-invent the wheel and leave out non-Drupal dependencies, we'd still face the same problem of requiring code execution to build the site. So we would not gain anything compared to just using composer.

Thus, the choice contrib developers like the Commerce and Rules teams face now is either
a) embrace the share-and-reuse mantra and require Composer, or
b) keep re-inventing the wheel, stay on the Drupal island and try to move things into mega API library modules like CTools.

Needless to say, option b) does not seem right.

I expect the following to happen:
1) We complain that we cannot simply download modules and components and place them in a directory and be done with it.
2) We think about a plan how we could work around that and still make it work.
3) Nobody does the work because we are developers and know how to use Composer and have better things to do.
4) Drupal 8 gets released.
5) Site builders have a short WTF experience when they are forced to use Composer.
6) Site builders get over it and comfortably use Composer.
7) THE END

Agreed - just like most site builders use drush already anyway. However, this will piss off non-experienced site builders who don't manage to get comfortable with Composer. The only way to avoid this is to invest now in building appropriate tools - the Composer GUI.

Mixologic’s picture

I'd like to discuss this part of the proposed resolution:

Remove packaged modules with Drupal 8 compatibility from drupal.org and replace them with Composer instructions.

Currently the drupal.org infrastructure is the canonical source for extensions to drupal - we've got the git repos as well as packaged tarballs, and as such we have the capability to keep track of all kinds of statistics about downloads and usage of modules and projects. If we move everything over to an external service like packagist, we'll lose some of that metadata, which may or may not be of value to the project as a whole. We also add a dependency on packagist's availability in order to build drupal sites.

I think we might want to look into the feasibility of using either satis (https://github.com/composer/satis) or perhaps sorting out a Toran Proxy (https://toranproxy.com/) .

Otoh, perhaps hosting packaging metadata is a way to get the infra "off the island" as well.

Perhaps this should be its own issue...

joshtaylor’s picture

@Mixologic, Drupal Packagist is a self hosted version (which will be hosted by drupal.org?) which means no third party will have that info :).

Mixologic’s picture

@joshtaylor - I didn't know that self-hosted Packagist was an option. That's great. Keep drupal-infra in the loop if/when this is the route we go.

Mile23’s picture

It's a little hilarious that this proposal is being touted as 'off the island' when it requires that drupal-infra make a Drupal-specific copy of existing infrastructure.

But if that's the best option, then yay.

However:

Spec a behavior. Think user story.

What does a user have to do in order to install something that has a dependency?

This issue has nothing to do with module installation. It is only and completely about assembling the required code base. Doing this through Composer has the additional (non BC-breaking) change that if you download a module, but do not enable it, its classes can still be used by code from enabled modules. Module installation is only required if the module itself needs application-level integration with Drupal (plugins, routes, services, etc.)

That's an interesting concept. In that case, the error is that the module maintainer should be creating a library which can be managed through composer. For instance, ctools would be composer require drupal/ctools and then everyone could use it. And not only Drupal users.

Still: Show me the behavior you want the user to perform in order to install stuff. There isn't enough here to evaluate.

I offered a workflow in #13, which is basically the same as Libraries, except using composer to manage the libraries. What's the workflow for this proposal?

bojanz’s picture

It's a little hilarious that this proposal is being touted as 'off the island' when it requires that drupal-infra make a Drupal-specific copy of existing infrastructure.

There are 50,000 packages on Packagist currently. There are about 30,000 Drupal modules. So the assumption has been that they wouldn't appreciate us increasing their index by 50% with Drupal-specific code.

That's an interesting concept. In that case, the error is that the module maintainer should be creating a library which can be managed through composer. For instance, ctools would be composer require drupal/ctools and then everyone could use it

There is no difference between drupal/ctools, commerceguys/addressing, symfony/validator. They are all Composer packages.
drupal/ctools is a package that depends on drupal/core, and everyone can use it (along with drupal/core).

Still: Show me the behavior you want the user to perform in order to install stuff.

cd drupal; composer require "drupal/ctools";

Mile23’s picture

There is no difference between drupal/ctools, commerceguys/addressing, symfony/validator. They are all Composer packages.
drupal/ctools is a package that depends on drupal/core, and everyone can use it (along with drupal/core).

My point about ctools was: In D8 context, it's *wrong* that it's both a module and a library. It should be a library so that it can be required without being a module. This was a poor example for a reasonable counter to Xano's point that modules could require other modules without enabling them.

Still: Show me the behavior you want the user to perform in order to install stuff.
cd drupal; composer require "drupal/ctools";

"Modules can no longer simply be downloaded and placed in a modules directory, but have to be placed within a Drupal application to Composer."

So that's the *only* way to install a module, even one *without* dependencies?

See #7. See also #2002304: [META] Improve Drupal's use of Composer where we're actually trying to make Drupal more composer friendly.

Xano’s picture

It's a little hilarious that this proposal is being touted as 'off the island' when it requires that drupal-infra make a Drupal-specific copy of existing infrastructure.

In addition to @bojanz's explanation in #25, the second reason for setting up our own Packagist repository is so we can claim the drupal vendor, which is impossible on packagist.org.

In that case, the error is that the module maintainer should be creating a library which can be managed through composer.

I did not mean to promote it as a feature per se. I guess it could be, but you're right that ideally such code bases should be split up into modules and libraries. That is, however, not something we should go to the effort of enforcing.

So that's the *only* way to install a module, even one *without* dependencies?

What needs to be very clear is that a module can have Composer dependencies by depending on another module that has Composer dependencies. I'm not sure we can make this change to Composer-Driven Development without removing dependency information from *.info.yml files. If we do, then Composer files become the canonical source for all dependency information (Drupal-specific metadata can remain in *.info.yml files) and we will end up in a situation where all modules are required to have the same files. TL;DR: modules without dependencies will not be a problem.

I am a little concerned here that we seem to be trying to address the problem by seeing if there is a solution for non-coders first. Several people have shared very compelling technical arguments, including one about security, in favor of replacing the current dependency management solution with Composer entirely.
While efforts on building a UI are a great way to clear up any confusion and move the bigger picture forward, this remains first and foremost an issue about low-level changes.

webchick’s picture

Several people have shared very compelling technical arguments, including one about security, in favor of replacing the current dependency management solution with Composer entirely.

As yet, those arguments seem to be quite out of touch with Drupal's target audience, as well as actual data. As an example, these claims seem fairly dubious to me:

- "When Drupalgeddon hit, the only devastated users were the ones who didn't use Drush, Git, or care about security" - No, the users devastated were those who didn't update within about 7 hours of SA-CORE-2014-005. That includes Drush and Git users as well.

- "most site builders use drush already anyway" - https://web.archive.org/web/20100301104600/http://drupal.org/project/usa... shows a peak usage of 2,918 users in March 2009 (a few months after Drush became a shell script rather than a module #, after which such stats are irrelevant). There were 147,202 Drupal sites in total on that date. That's about 2%. Granted, 2015 is a lot different than 2009, but going from 2% to 51% seems incredibly unrealistic. The most-used module in the entire Drupal ecosystem (Views) has only 78% usage.

I think Mile23 has the right of it when he reminds folks that at the end of the day, Drupal is a CMS. Extending the CMS to add other functionality to it via modules is literally the foundation. Requiring site builders (technical or otherwise) to understand how to dig into the code files of modules to understand how they need to install it (use composer if it's one way, just check a checkbox if it's another) is adding an enormous amount of complexity that will leave Drupal out of reach of hundreds of thousands of its users.

I don't disagree that Composer is a great dependency management tool, nor that more and more module developers will probably adopt it. But I simply do not follow the logic of changing the entire CMS extension mechanism for site builders based on that. We already provide an API so module developers can block the installation of modules until requirements have been met, and that is hook_requirements(). Then the workflow is the same for everyone, for all modules whether they define dependencies or not, and the same as D7 as well: copy module code into the site's modules directory, go to the modules page, check the box, fix any errors (including "go download composer and run this command from the command line"), retry.

The pushback on hook_requirements() seems to be that it requires duplicating information that developers already put into composer.json, and that doing an exhaustive check for all dependencies would be an incredible pain in the ass. I hear those concerns.

But then to me, that means that we need to add helper mechanisms in the CMS to better enable Composer. For example, automatically checking Composer dependencies in system_requirements() prior to enabling the checkbox to turn a module on, having the module UI automatically download any dependencies defined in Composer (I don't understand the pushback on permissions... Update Manager is already writing to sites/all/modules, which is a non-web-writable directory). Or whatever other ideas we can think of. But not translate the lack of these helper mechanisms into a bunch of additional mental overhead to non-developers that invalidates all existing documentation.

bojanz’s picture

Requiring site builders (technical or otherwise) to understand how to dig into the code files of modules to understand how they need to install it (use composer if it's one way, just check a checkbox if it's another) is adding an enormous amount of complexity

Agreed, for me it makes sense that Composer is used to install all modules regardless of their dependencies.

EDIT: I've started adding responses to this comment, but I think I'm repeating what's already been said. Let's sit down in LA and go over this in person, it will help filter the arguments for/against.

Xano’s picture

Requiring site builders (technical or otherwise) to understand how to dig into the code files of modules to understand how they need to install it (use composer if it's one way, just check a checkbox if it's another) is adding an enormous amount of complexity that will leave Drupal out of reach of hundreds of thousands of its users.

As I've mentioned before: this issue has nothing to do with installing modules and whether or not to check a checkbox. It's about downloading modules and their dependencies. Those are two related, but fundamentally different aspects of building a site.

webchick’s picture

Yes. And I'm saying since site builders primarily interact with modules via the admin/modules UI, you can block them with hook_requirements() from installing until all requirements are downloaded. Just as any module that requires external dependencies has done since Drupal 5 or whatever.

One other thing not addressed by this workflow change (as opposed to helper functionality in the CMS for Composer) is that it would mean only 8.x modules could use Composer dependencies. I would think that since this is the way PHP in general is going, we would want to allow developers to manage dependencies with Composer in 7.x as well. We could easily add pre-checking for Composer and dependency downloads as non-BC-breaking features in both versions. We cannot change the installation instructions for modules in D7 4+ years after its release.

So, seriously confused about why we can't do exactly what Mile23 suggested in #13:

The proper solution is this:

  • If you install a module that has a dependency, you should get a notice in Drupal that the dependency is still unmet and that composer is needed to meet it. This comes from hook_requirements().
  • The user then goes to the command line and types 'composer install'. Because we will have fixed D8's use of composer, this will only add the library in question.
  • If you used drush to DL the module, drush should say something like, "This module has unmet dependencies managed through composer. Do you want to install these dependencies? (y/n)"
  • And you are done.

Once the necessary plumbing is there, works in any Drupal version, no?

stefan.r’s picture

I haven't been following this issue so these may be dumb remarks (and may have already been considered), but if a "composer-enabled" Drupal core doesn't end up happening, couldn't this be dealt with in Drush and d.org packaging instead?

The drupal.org module packaging script could then just do a composer-enabled "drush dl". We already do a "drush make" for distribution packages, and when I asked @drumm about allowing whitelisted libraries not just in distribution makefiles but also in modules, I was told it shouldn't be a big technical challenge. Conceptually, composer.json files aren't hugely different from drush makefiles; it's just that we're talking about potentially a larger number of libraries than whatever is on our whitelist now.

Also, I don't know enough to provide a workable alternative but the solution in #13 doesn't sound ideal. It would still require users to go into the command line, which I think we were trying to avoid?

As to the command line/shell_exec/composer requirement: I don't know if such a thing exists, and this may have already been mentioned, but could we include a composer file processor in Drupal vendor scripts that gets rid of the whole composer requirement? Perhaps we can run a crippled version of Composer from Drupal directly, without the command-line composer requirement?

klausi’s picture

no, you cannot simply download packaged modules + components because Composer needs to rewrite the autoloading files.

This has come up a couple of times now, we need to update the issue summary from the discussion so far.

stefan.r’s picture

Ha, @klausi I should have read your and bojanz's comments here, indeed this had come up before. I'm not sure that there's really "no possible workaround" to the autoload problem, but looking at how composer generates autoload files, likely any workaround would get ugly and would be better dealt with in core anyway :/

Also Embedded Composer is probably the non-command line Composer I thought of.

Crell’s picture

klausi pointed me to this issue via Twitter. Putting on my ambassadorial hat for a moment...

As usual, there are a couple of inter-related issues floating about here together, which makes the discussion more difficult.

The following are, I think, the indisputable facts:

* The PHP world at large has adopted Composer as THE package management system. Many frameworks (e.g., Symfony, Silex, Laravel, and assorted others) use Composer as the primary preferred way to install not only extensions but the framework itself. (That is not universal, however.)

* There will be Drupal 8 modules that make use of Composer. What percentage of them is unclear, but enough big-names will do so that nearly all site builders will at some point need to use a Composer-leveraging module.

* Drupal 8 core's method of using Composer right now is sucky and broken. Modules that want to use Packagist-provided libraries are currently second-class citizens to Drupal-only modules.

That "second class citizen" problem is the one that needs to be solved.

Slightly more subjective:

* "Drupal is a CMS, so framework logic doesn't apply." Except that Drupal prides itself on being a framework/application hybrid thingie. That means framework logic *does* apply, at least in part, because to a module developer, Drupal *is* a framework. So ignoring what the bulk of the broader PHP-writing community is doing is counter-productive and hinders all the bridge building we've tried to do in the last 4-ish years. If we want more PHP developers to embrace Drupal, we need to do so on their terms.

* At the same time, though, Angie's correct that a huge part of Drupal's sales pitch is "you don't have to be a developer to get things done". Adding "but you have to use a developer tool you probably don't have installed" to that rather undermines the sales pitch...

* On yet another hand, though, the idea that "our target audience is people who don't know what a command line is" is selling our users short by a wide margin. I've always felt that we have a mythical "IQ of 80 click and drool" user in mind as the typical user, but I have never once seen any actual data to back that up. My gut feeling is that we've lost most of those to Wordpress years ago and aren't getting them back any time soon. So really, we're all guessing.

Now, there are two closely related but distinct issues here, as Mile23 noted: downloading and enabling. I'll tackle the second first.

To cut to the chase, a module that has unfulfilled Composer dependencies should be blocked from installation just as much as one that has unfulfilled Drupal dependencies. The logic for that is going to be the same for all modules, so like Angie said it's stupid to make each module implement that themselves in hook_requirements(). That's core's job; core should take care of keeping the checkbox uncheckable. Someone file an issue and link to it, please. :-)

Downloading is the tricky part, because if a module uses Composer then downloading is a *two-step process*, which most modules have historically not been. You need to download the module and put it somewhere, and then *also* run composer on the command line to download its dependencies and regenerate the autoload files. That is, the download process *must* include writing PHP code to disk. There is no way around that.

The logic of the argument is then:

1) Even if we let people install modules from the UI, they will often not be able to download them without going to the command line.
2) If they're going to have to go to the command line anyway, we may as well put both tasks (download and install) there so they don't have to switch back and forth. (Side note: This is where Console in core would be super helpful.)
3) That means sometimes users will be able to do everything from the UI, and other times will need to use the command line. Or rather, the command line will always work, the UI will sometimes work.
4) Forcing users to learn 2 mechanisms sucks. So if one always works, and one sometimes works, just use the always-works always.
5) We could eliminate a lot of code if we outsourced that dependency stuff to Composer entirely. Less code for us to maintain++. (And it would also let us have a single autoloader from Composer for all modules, period, and that simplifies bootstrap a bit too.)

Which is a completely valid and rational logic chain to follow; in fact, its one and only flaw is the "don't have to be a developer" part. If it weren't for that, I'd say the argument is settled and we outsource everything to Composer and be done with it, as that is the best possible approach... for people who are cool with running 3-4 lines on the command line.

Unfortunately, that's a rather big flaw given Drupal's market position, and I'm fairly confident that even though I would be on board with it myself our Product Owners (Dries and Angie) would never sign off on it. If we were OK with "CLI required to download anything", then the above logic would be case-closed and we move on. But, I don't think we are.

Which leaves us one of the following options:

1) Some modules can only be downloaded and made installable from the command line, and there may or may not be a "good" way to do so. (IE, one that doesn't qualify as "hacking core".) This is the "second class citizens" option.
2) We build some sort of fancy-pants way to trigger a composer install/update from the GUI securely. The kind of people that are in this thread ignore it and use the cli (much as they use drush now), while the "what's a CLI?" user can trigger the secondary download of composer libraries and autoload regeneration from the GUI.

Now, that second option sounds great, until you realize that it's also really really hard to do securely. We're talking about rewriting one of the most important PHP files in the system from the GUI; one that, if compromised by an attacker, is the keys to the kingdom on a silver platter. One could argue that we already have a mechanism for that with the downloader system. However, that is fragile and requires PHP to SSH into itself. That requires PHP PECL extensions that are, guess what, not normally installed! That is, it doesn't work at all for *a whole lot of systems*. And the people who would be able to install the PECL module are a subset of those who could just use the CLI in the first place. Taking this route would inevitably leave out a whole lot of people that it would NOT work for, because their server doesn't support it. (That's before we even get into the poor UI, unless it's improved since Drupal 7. I've not checked.)

However, there's one point that needs to be made explicit: *Any* such "GUI for Composer update" needs to be built strictly on top of a solid foundation. That solid foundation is vanilla composer. That is, step one for implementing this GUI option is... to Composer-all-the-things as described above. Let Composer do all of the things it's good at; all we'd be doing is adding a GUI to run Composer commands and call it a day; Composer would download Drupal modules and 3rd party libraries alike. To try and dance around that approach would be, to be blunt, grotesquely stupid.

Which leads me to the conclusion that IF we are willing to put the resources into it, our least-bad option is:
1) Composer-all-the-things; all downloads go through Composer, and we should look into Embedded Composer to eliminate the extra download. (But we should still allow people to use the composer on their system.)
2) Build the necessary plumbing to enable a loopback-ssh (not FTP; that wouldn't work anymore, and is insecure anyway) execution of composer. NOT of uploading, not of downloading, just of executing composer. Nothing more. That probably would obsolete the current Updater entirely.
3) Accept that a significant percentage of users would not be able to leverage option 2 and would have to use the CLI to download modules, including Drupal-only modules. (Since step 1 is putting Composer in charge of autoloading those.)

Although, now that I think about it, that last one brings up an issue that no one else has mentioned, but is problematic: we don't want an uninstalled module's classes in the autoload list, but installing the module is not a step that should be modifying code. Poopy. I do not know what to do on that front. Suggestions?

There's one other point here, which is Packagist. This comment is long enough as is so I will cut to the chase: Dear god do NOT implement our own parallel Packagist! No! DO NOT WANT! It's a terrible waste of resources, as well as politically stupid. For one, it would mean adding extra information to every (many?) composer.json files to tell Composer, hey, this module is not on the normal Packagist. For another, Laravel got a lot of blowback when they introduced their own Packagist-extensions site, rather than putting resources into improving the "Commons" of Packagist.org. Let's not do that. We're trying to become less isolationist, not more.

Instead, determine what we would spend maintaining that part of our infrastructure for a year. Take half that amount and dedicate it toward improving Packagist in ways that would be helpful to us. Would it be useful to allow an organization to "claim" a vendor space? Let's talk to Jordi and do it. I would be more than happy to make introductions between Jordi and any DA developer needed.

That also means we could address the "this is a lot of work" problem above by calling it a Drupal.org improvement, and having DA staff work on it. Sure, patches still go through the normal workflow but there's our resources. And it's an investment in saving the DA money in the long run.

We don't have any win-win options here. We're going to have to say "No" to someone, and accept we're making their lives harder. I believe the above is likely the one that says that to the fewest people, but I can't prove or disprove that.

Mile23’s picture

Dear god do NOT implement our own parallel Packagist! No! DO NOT WANT! It's a terrible waste of resources, as well as politically stupid.

Just want to zero in on this. :-)

Also, @klausi says:

no, you cannot simply download packaged modules + components because Composer needs to rewrite the autoloading files.

Composer needs to dumpautoload, yes. This means that vendor/ needs to be writable. That's the only requirement. It's a reasonable requirement if we're using Composer to manage stuff. See: #2380389: Use a single vendor directory in the root

So let's get back to behaviors:

IF we are using Composer to manage MODULES, then we are going to need to do the following:

  1. Either Drupal itself or any module that wants to be installable this way will require composer/installers. (Why? Because.)
  2. Modules will declare themselves of type module in their composer.json.
  3. Users will say things like composer require drupal/project-examples.
  4. Users will then fire up their Drupal and install the module, or say drush en module.
  5. And we're done.

IF we are only going to use Composer to manage DEPENDENCIES, much as we use Libraries in D7, we need to do the following:

  1. Modules declare their dependencies in composer.json.
  2. hook_requirements() is used per module to determine whether the module can be installed.
  3. Alternately, we add a system which allows Drupal to discover whether the dependency is met already.
  4. The user sees an alert in Drupal that they should run composer install if the dependency is not met.
  5. The user runs composer install.
  6. The user then installs the module.

Note that both of these options are very similar, have significant overlap, are not mutually exclusive, and both rely on fixing D8's composer implementation. For a start: #2380389: Use a single vendor directory in the root. Note that this is an easy fix. Super easy. Easiest thing in the world. Just full of yak-shaving goodness. And once the yak is shaved, we'll be Happy And Productive Members Of The Larger PHP Community. Yay!

Note also that if we *can't* make those super easy things work, then it's not worth the trouble to even start making a patch to implement Drupal auto-checking composer-based dependencies. Many have tried and failed already, which is why we have this issue in the first place.

aneek’s picture

So until this is decided, if I want to develop a custom module that requires an external PHP library, I have to do it in the following way?

  1. Create a composer.json in my module's directory.
  2. To use this, I have to run composer install in my module's directory.
  3. Installing the packages will create a vendor directory in my module's directory, and I have to require_once the vendor's autoload.php to get the package classes (as sketched just below this list).
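A rough sketch of that include step, assuming a hypothetical mymodule.module with a module-local vendor directory (the very per-module layout this issue argues against, since it bypasses site-level dependency resolution):

// In mymodule.module (hypothetical name): load the module's own Composer
// autoloader if it has been generated by running composer install here.
$autoloader = __DIR__ . '/vendor/autoload.php';
if (file_exists($autoloader)) {
  require_once $autoloader;
}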

Limitations:
While installing this module the user has to use Composer, and this module can't be installed via Drupal's module installation UI.
Will Drush commands automatically install the dependencies? -- I hope not. (Please, correct me if I'm wrong)

I saw a similar example with Drupal Module Upgrader (drupalmoduleupgrader). This module also depends on various PHP libs and includes the autoload.php the same way I mentioned earlier. But is this the proper way as of now? Because there is no way my module can tell, while installing itself, whether the PHP lib is available.
Maybe,

function hook_requirements($phase) {
  $requirements = array();
  if (!file_exists('/path/to/lib/autoload.php')) {
    // REQUIREMENT_ERROR blocks installation until the library is present.
    $requirements['my_module'] = array('title' => t('My module'), 'severity' => REQUIREMENT_ERROR);
  }
  return $requirements;
}

So any suggestions? -- Thanks!

Xano’s picture

@aneek Please create another issue for that, so we don't derail this one (your question is related, but slightly different from the topic of this issue). See Composer Manager as well, or just run composer require ... in your Drupal root, which is how some other modules will require to be installed.

aneek’s picture

@Xano, thanks. I think I'd better consult with others in IRC before filing a new issue. Though there will be hundreds of modules that would love to have functionality to check for dependencies automatically (maybe much like Composer Manager does). It would provide a great UX in the future for module developers as well as site builders.

Crell’s picture

Looks like Packagist now supports owned vendors: https://github.com/composer/packagist/issues/163#issuecomment-99673878

Yay, Jordi!

Mixologic’s picture

re #40: that's great. So how do we claim the drupal namespace, and who is going to own that for the time being until we get this all worked out? Shall the Association create a drupal.org account that owns that namespace?

I've got some other, infrastructure/security specific concerns, and didn't want to pollute this issue with a slew of tangential questions, so I opened another issue in the infra queue: #2485011: Infrastructure Requirements to support Composer based workflows

Crell’s picture

Before we get to that, there's still the very open question of how we resolve the "dev vs. downloader" question above. That is THE sticking point and always has been. Don't get too far ahead of yourself. :-)

Mile23’s picture

@crell: As it turns out, 'devs vs downloaders' isn't really a dichotomy. Everyone can have their cake and know its deliciousness.

However: Until this test passes, we can't really move forward on anything here: https://travis-ci.org/paul-m/d8-drupal-require

Crell’s picture

Mile23: Did you see webchick's comments? Or my lengthy #35? I don't see how the cake is not a lie...

klausi’s picture

@Mixologic: if I understand it correctly then anybody having a drupal package on packagist.org is now automatically a maintainer of the "drupal" namespace there. That includes me, Dries and many other people, see https://packagist.org/search/?q=drupal

webchick’s picture

That essentially makes our project application review process go right out the window, eh?

Crell’s picture

webchick: Well, not necessarily. If we require that only projects that originate on Drupal.org can use a drupal/ vendor namespace on Packagist (we probably could manage that), and only push projects there if they are promoted and not sandboxed, then we'd still have that gate control.

We couldn't stop someone from using a non-drupal/ vendor namespace on Packagist, but that wouldn't break anything since the code namespace is what matters, and that doesn't have to be the same as the Packagist namespace. But then, nothing stops someone from downloading a tarball snapshot from GitHub today for a module, and people already do that.

What it would mean, true, is that it's not harder for someone to composer require crell/some-module than it is for them to composer require drupal/some-module, whereas right now Drush dl requires extra work to handle off-site modules. But to be honest, I think that's a battle we are doomed to lose eventually. Instead of making it harder to be on not-Drupal.org, we should be making it more advantageous to be on Drupal.org.

Mile23’s picture

Webchick: That essentially makes our project application review process go right out the window, eh?

...

If we require that only projects that originate on Drupal.org can use a drupal/ vendor namespace on Packagist (we probably could manage that), and only push projects there if they are promoted and not sandboxed, then we'd still have that gate control.

Initially I thought thumbs-down on that, but I evolved to thumbs-up, as long as sandboxes can serve other vendor namespaces. Like mile23/netbeansdrupalcomposed or something. :-)

So what would have to happen is: Somewhere in the Project system someone will have to make Drupal talk composer.json enough to verify these things, and mark projects as crawlable/not-crawlable by packagist.

Is that doable?

bojanz’s picture

Mile23’s picture

I'll be in a tent in an ancient forest on the 12th.

Regarding that BoF description: I'm against *mandating* composer for modules, but I also think people will figure out that once Drupal starts doing it right, all the use cases will be satisfied.

fago’s picture

There's one other point here, which is Packagist. This comment is long enough as is so I will cut to the chase: Dear god do NOT implement our own parallel Packagist! No! DO NOT WANT! It's a terrible waste of resources, as well as politically stupid.

I agree that our resources are probably better used by making Packagist work for us instead of making our own version. However:

For one, it would mean adding extra information to every (many?) composer.json files to tell Composer, hey, this module is not on the normal Packagist.

It's actually a rather simple statement that you add once to your project's composer.json and you are done. I don't think the packages/modules would have to care.
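
For illustration, the kind of one-time statement being described here would look roughly like this in the project's root composer.json (the URL shown is the community-run Drupal Packagist instance mentioned later in this thread, used purely as an example):

{
  "repositories": [
    {
      "type": "composer",
      "url": "https://packagist.drupal-composer.org"
    }
  ]
}

Composer then consults that repository in addition to packagist.org when resolving packages.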

For another, Laravel got a lot of blowback when they introduced their own Packagist-extensions site, rather than putting resources into improving the "Commons" of Packagist.org. Let's not do that. We're trying to become less isolationist, not more.

Imo, having our own Packagist would be a huge improvement and not a big hurdle at all. Also, we would not be isolated, as everyone could just use the Drupal Packagist as a source too - we are using the same tools after all.
Thus, if it turns out to be simpler to roll our own Packagist for whatever reason, I wouldn't see a problem with that.

Still, there is the vendor claiming issue. Even if we had our own Packagist, we would probably still want to secure the drupal prefix on packagist.org, as packages could sneak in there otherwise...

BOF created: https://events.drupal.org/losangeles2015/bofs/composer-and-drupal-8

Great! As I'm not in LA I won't be able to join :-/ Would be great if someone could provide a summary of the discussion afterwards.

Mixologic’s picture

Darn. That BoF is right during the DrupalCI presentation.

Xano’s picture

One of the issues related to Composer-driven development is how to include local modifications of Composer packages in a Drupal site for testing application-level integration. The reason this is more complex than unit testing is that the package must be located in a particular subdirectory (because the package is a module), which can only be done using composer/installers, which is a dependency of Drupal's ./composer.json.

Modules must be added as dependencies to this Composer file for them to be put in the correct directory and for their dependencies to be resolved properly. However, by doing this, Composer will include the module by copying the code from the module's repository, whether that repository is online or local. It will not include local modifications that have not been committed yet, which is exactly what you want while developing that module.

I found this issue for Composer with a discussion about how a proper development workflow can be set up.
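
As a rough sketch of the workaround people commonly reach for (module name and branch are hypothetical): register the local clone as a vcs repository in the site's root composer.json. Note that this still only picks up committed changes, which is exactly the limitation described above.

{
  "repositories": [
    {
      "type": "vcs",
      "url": "/home/me/checkouts/mymodule"
    }
  ],
  "require": {
    "drupal/mymodule": "dev-my-feature-branch"
  }
}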

Another reason why Composer Manager is not an ideal solution is that it requires an installed and bootstrapped Drupal site to operate, whereas Composer itself can be used without a working application. This means that test builds can't fail early by running unit tests before setting up the environment for integration tests, because the environment must first be built to assemble the code base.

Crell’s picture

#53: I'm assuming that custom modules for a site will be checked into the main repo, not re-pulled on each checkout.

I spoke with Jordi about integrating Drupal.org into Packagist, and it sounds surprisingly easy.

1) We need to get ownership over the drupal/ vendor. Right now whoever registered a current drupal/* package has that; we need to clean that up and turn the account(s) over to the DA infra team.

2) We send an HTTP POST request to Packagist.org's API with the repo to update, once after each repo push, new tag, or whatever non-cron trigger we feel is appropriate. Packagist then does its thing. See https://packagist.org/about under the "Update schedule" section.

He also felt that Packagist should be able to handle the load of adding Drupal modules, as long as we aren't hammering the site with cron on a regular basis. So as long as we're well-behaved it should be really straightforward to write the code to push composer.json-containing projects to Packagist.
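
A hedged sketch of what that push could look like from the Drupal.org side; the endpoint, credentials, and payload shape are assumptions based on Packagist's public documentation and should be verified against the "Update schedule" section linked above:

// Sketch only: notify Packagist that a repository changed.
// USERNAME, API_TOKEN, and the package URL are placeholders.
$client = new \GuzzleHttp\Client();
$client->post('https://packagist.org/api/update-package?username=USERNAME&apiToken=API_TOKEN', [
  'json' => ['repository' => ['url' => 'https://packagist.org/packages/drupal/some_module']],
]);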

The caveat, of course, is that we MUST use a semver-friendly versioning scheme for contrib modules or Composer won't know what to do with it. See #1612910: [policy, no patch] Switch to Semantic Versioning for Drupal contrib extensions (modules, themes, etc)

Mixologic’s picture

How would a site check for updates (both security related and feature updates) if it were using composer, for both drupal modules and included code from other sources?
For reference, Drupal.org had 46,835,160 hits to updates.drupal.org yesterday - granted, that could probably be reduced to under 1 million a day due to how the updates work, but that's still a rushing firehose of traffic.

deviantintegral’s picture

In Composer Manager we have a small wrapper module around sensiolabs/security-checker, which works reasonably well. It doesn't address their security policies, the distribution of SAs, or pure feature updates, but perhaps that library is a good place to start.

Crell’s picture

I also heard back from Michael Babker of Joomla. According to him, Joomla is, if anything, even worse off in this regard than we are currently; they're checking Composer libraries into their repo, but not the composer.json or composer.lock. That means individual sites can't do much of anything. Also, they have no equivalent of composer_manager, so every extension is on its own to do the same. Essentially, "yeah, sorry, no composer for you!"

So our current working plan still sounds better.

I also spoke with Michael Cullom of PHPBB, who is trying to figure this out as well. He has no solution for the security issue either.

Once again, I think the plan discussed in LA is the best available at this time.

Mixologic’s picture

"Once again, I think the plan discussed in LA is the best available at this time." - could somebody summarize that plan?

My main concern is that currently, if you are a drupal site owner, your site is connected to a service that provides a reliable security status for your site. I assume that we are not planning on regressing that feature.

In order to move to composer built sites, and preserve that feature, we need to have a way that sites can access all of that data in aggregate, so they can be alerted about vulnerabilities in core, modules, and all of the dependent libraries.

So, either drupal.org can provide that feature and behind the scenes we coordinate with other services like https://security.sensiolabs.org, or we do all of our security reporting to those services, and drupal core's updates/security service is wired to them. https://security.sensiolabs.org/stats is a little bit unsettling because it shows that A. their service might not be as reliable as what we have now (the big negative spike - though could just be a reporting blip), and B. might not be as robust as we have now - 280,000 checks per week at their peak is at least an order of magnitude smaller than the traffic they'd get from d8 sites 6-12 months after d8 is released (I hope).

Crell’s picture

The Framework Interoperability Group is currently discussing a pair of PSRs, 9 and 10, which are a standard security disclosure policy and a standard SA format, respectively. The idea of the latter is to allow anyone to build a service like security.sensiolabs.org, likely building on data provided via Composer. If you're interested in that conversation, join the FIG mailing list. :-) (There is no timeline on either of those PSRs, of course.)

For the time being, I don't have a problem with Drupal.org still being the SOA for update and security awareness for Drupal modules specifically. We can let this evolve, and change course gradually once the broader industry gets some of these tools in place. In fact, we can help drive some of those tools as long as we do them in a Drupal-agnostic way.

bojanz’s picture

Here's a summary of the BoF. Apologies for the delay, I blame travel & laptop death.

Attendees: alexpott, xjm, webchick, Dave Reid, Crell, socketwench, slashrsm, webflo, dwkitchen, Chris Weber, others (sorry if I forgot to list you).

What was discussed and decided:
1) We want to remove core/vendor from the git repository, and add it to the tarball during packaging.
We need the same packaging capabilities for distributions too, since most distributions will have composer dependencies. Ideally this would happen before 8.0, and be funded by the DA. Unfortunately I haven't had time to contact Mixologic during the conference to discuss this.
Issue: #1475510: Remove external dependencies from the core repo and let Composer manage the dependencies instead.

2) We let update manager die. It's already unloved, and on its deathbed, and there's no way to make it work with modules that require Composer.
We went over a hundred "but what if" ideas about making it work with Composer, all were shot down.
(One included requiring hosts to install a php extension that would allow update_manager to ssh in)
Issue: #2352637: Remove the UI for installing/updating modules from update module if it is not fixed in time for release.

3) We create a bridge from Drupal.org to Packagist so that every module with a composer.json file gets a matching Packagist entry.
The initial idea was to use Drupal Packagist because of the sheer number of entries (contrib represents 25% of the current packagist size) and traffic we'd be sending their way. Crell feels strongly that we should use Packagist instead, and contact Jordi to help us integrate it. His reply in #54 is an update on that.

Note that every module will want to have a composer.json, even if it doesn't have Composer dependencies, in order to allow it to be a dependency of another module. We don't need to enforce this, contrib will make the change organically.

4) We recognize that there is no avoiding Composer in contrib.
Many contrib modules will require installation via Composer only, by hiding releases from their d.o page and displaying a "composer require drupal/module" line instead. This is especially relevant to big contribs such as Media, Commerce, Rules, etc. It is possible that all top 30 contrib modules will require this, and that every site builder will need to use Composer at least once during the site building process.

Crell was in favor of ripping off the bandaid and requiring all modules to be installed via Composer only, but the final consensus was that we should let this happen organically. That way we can recognize reality instead of anticipating it, since D8 at this point doesn't leave us much space for bravery.

5) We need to solve the $root/vendor VS $root/core/vendor situation. The introduction of autoload.php was a poor bandaid.
The initial agreement was to move core/vendor back to root, as proposed in #2380389: Use a single vendor directory in the root.
However, this creates a workflow problem:
- Site builder installs Drupal, $root/vendor has core dependencies.
- Site builder uses Composer to install Commerce, $root/vendor now has both core and Commerce dependencies.
- Site builder hands off the site, and it's deployed to production.
- Drupal core update is released, someone else manually updates core, overwriting vendor/ with the one from the tarball.
- Commerce (and thus the site) breaks since vendor/ no longer contains Commerce dependencies.

This is fixable by rerunning "composer update", but it means that once you use Composer for the first time, you must continue using it for core updates too, manual core updates are no longer possible. Both webchick and alexpott strongly believe that it's not possible to enforce this, and that it will lead to broken sites and/or avoiding core updates altogether. As an alternative I've started exploring the "embedded composer" approach of using $root/vendor only for contrib dependencies in #2489646: Start using $root/vendor for non-core dependencies only. It's non-trivial, but possible.

Other things discussed:
- We clarified that it's not possible for each module to ship with its own vendor/
- Alex proposed creating a d.o "distro builder" that allows you to list the modules that you need, and get a tarball with packaged dependencies.
In addition, the system could remember your configuration so that when you want to go back and add another module, you don't need to start from scratch.
This was left undecided because it would be a big drupal.org change / DA investment, without completely solving our Composer troubles.
It is perfectly possible to explore this outside of d.o, of course.
- I proposed creating a GUI (acquia dev desktop-like) frontend for Composer. People felt that a web based tool such as the one Alex proposed made more sense.

Feel free to post corrections. This was a very tiring 2+ hour discussion, so I might have gotten some details wrong.

Conclusion: We recognize that Composer is the present and the future. We also recognize that requiring site builders to use a command line tool is problematic.
We want to make sure core allows a Composer flow, and then see where the future leads us. However, making Composer optional still leaves us with a few problems (the vendor/ VS manual core updates issue).

Xano’s picture

This is fixable by rerunning "composer update", but it means that once you use Composer for the first time, you must continue using it for core updates too, manual core updates are no longer possible.

After you've used Composer once, manual updates should be possible again after you've stopped using any contributed modules that require Composer, or am I missing something here?

bojanz’s picture

Well sure, but the assumption here is that you want to keep using the modules you had, while still performing a manual core update.
Perhaps someone else did the site building and you don't even know how the module got there (whether Composer was used or not).

davidwbarratt’s picture

I'm sad that I missed the BOF, but I gave a presentation on most of the issues that it looks like were covered in the BOF.

https://youtu.be/JNhFSvaM1Zo

NOTE: I misunderstood one of the questions, which was about being notified of security updates. The only thing I'm currently aware of is Sensio's Security Check; it might be worth learning what they do or integrating with their service. I wonder if it would be best to have a web service on d.o that would take a composer.lock file and return which modules need to be updated.

Xano’s picture

Well sure, but the assumption here is that you want to keep using the modules you had, while still performing a manual core update.

I'm inclined to say that this is an either/or situation: you either maintain your code base by manually downloading packages (only possible without Composer dependencies), or you switch to Composer entirely. I don't think anyone can expect a system to be maintained using different workflows for two parts of the system (core and contrib), while both parts have Composer dependencies that will need to be maintained and kept compatible.

Perhaps someone else did the site building and you don't even know how the module got there (whether Composer was used or not).

A valid concern. While decisions like these should be documented for projects, the fact that quite a few, if not all, of the most popular contributed modules will end up having Composer dependencies will eventually make people think about this automatically.
Many projects contain patches that need to be applied after performing updates or even after every single build. Libraries need to be downloaded and placed in the correct directories, etc. Such actions can be automated, but, like running composer update, should be thoroughly documented for projects that require deployments and updates to be performed manually.

webchick’s picture

Thanks for the summary, bojanz. It looks accurate except for #2. I definitely remember us deciding Update Manager wouldn't be able to eliminate the need for site builders running Composer from the command-line on modules that required it, unless some sort of SSH PECL extension was installed on the host (which is rare). I don't however remember consensus for ripping it out as a means to download modules.

Another aspect missing is that I thought? we agreed that we should bake some smarts into system_requirements so that if a module has a composer.json, *and* it specifies external dependencies (ones that don't start with "drupal/") it would prevent the module from being installed, with some sort of error message that points people to how to download/run composer. This means we don't need to create a "two-tier" system where some modules have tarballs (and thus all the benefits that come with that, like download tracking, usage stats, security update notifications, etc.) and some only have a copy/paste command line snippet. And we also then would not need to force every module developer who wants to use Composer to write tedious lines in their own hook_requirements() to check for missing dependencies.

I don't think an issue for that one exists yet.
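
To make the idea above a bit more concrete, here is a minimal sketch (not core code; the function name, file layout, and the Composer 1 flat installed.json format are all assumptions) of such a check: read the module's composer.json and flag any non-"drupal/" requirements that Composer has not already installed.

function example_unmet_composer_requirements($module_path, $vendor_dir) {
  $module_file = $module_path . '/composer.json';
  if (!file_exists($module_file)) {
    return array();
  }
  $data = json_decode(file_get_contents($module_file), TRUE);
  $require = isset($data['require']) ? $data['require'] : array();

  // Build a lookup of packages Composer has already installed, assuming
  // Composer 1's flat vendor/composer/installed.json format.
  $installed = array();
  $installed_file = $vendor_dir . '/composer/installed.json';
  if (file_exists($installed_file)) {
    foreach (json_decode(file_get_contents($installed_file), TRUE) as $package) {
      $installed[$package['name']] = TRUE;
    }
  }

  $missing = array();
  foreach (array_keys($require) as $name) {
    // Skip platform requirements and other Drupal projects.
    if ($name === 'php' || strpos($name, 'ext-') === 0 || strpos($name, 'drupal/') === 0) {
      continue;
    }
    if (empty($installed[$name])) {
      $missing[] = $name;
    }
  }
  return $missing;
}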

yched’s picture

+1 on considering that we can't support a tarball download of core on top of an install that used composer earlier on.

What if core tarballs included a marker file, composer update added a different marker file, and we just threw an error at runtime if both files exist?
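
A minimal sketch of that runtime check, assuming invented marker file names:

// Refuse to boot if a tarball core update has overwritten a
// Composer-managed vendor/ directory. Both marker files are hypothetical.
if (file_exists($app_root . '/core/TARBALL_INSTALL.txt') && file_exists($app_root . '/vendor/COMPOSER_MANAGED.txt')) {
  throw new \RuntimeException('This code base mixes a manual core update with a Composer-managed vendor directory. Rerun "composer update" before continuing.');
}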

Xano’s picture

I don't think an issue for that one exists yet.

I didn't think so either, so I opened #2494073: Prevent modules which have unmet Composer dependencies from being installed. Anyone is free to give that issue a bit more direction in the summary.

Crell’s picture

Thanks, bojanz. One minor correction: I wasn't as much in favor of forcing composer-all-the-things right now as I was explicit that we're going to end up there sooner or later one way or another, and we need to accept that. The mechanics of how we accept that are the important part, and I'm on board with the "organic as contrib pushes it" approach we ended up with.

webchick: Yes, we did say that core should autocheck for composer.json requirements, because making everyone do that themselves is lame. (I just commented on that in the new thread Xano opened.)

However, I thought we had said that longer term, we *would* ask the DA to modify project pages so that modules that have a 3rd-party composer dependency would automatically show the necessary composer require line rather than the tarball, since the tarball wouldn't be useful anyway. That's not a release blocker, but a normal feature request for d.o.

bojanz’s picture

@webchick
Thank you for the reminder about hook_requirements(), I forgot about that.

The update manager notes were merged with a later discussion with Alex, where we discussed the current unmaintained status of the module.
I'm fine with composer-requiring modules (contrib top 20 or whatever) simply not being installable via update manager.
It's another example of "letting things happen organically".

However, I thought we had said that longer term, we *would* ask the DA to modify project pages so that modules that have a 3rd-party composer dependency would automatically show the necessary composer require line rather than the tarball, since the tarball wouldn't be useful anyway. That's not a release blocker, but a normal feature request for d.o.

At the BoF it was said that it's fine if modules simply do this manually (disable releases, update the project description). I agree that in the long term a bit of automation and styling would help.

andypost’s picture

It's still not clear what to do with the autoload files.
Once Composer downloads a package it updates the autoloader, but the module is not enabled at that stage.
So, as @Crell mentioned, we end up having a lot of garbage in the autoloader.

Fabianx’s picture

Disclaimer: I love composer, I like it and use it frequently for my own projects. I just don't think Drupal should integrate with composer (be dependent, rely on, change everything over), but instead that we should find a way to integrate composer into Drupal - cleanly.

Point 1:

So composer is just a PHP script / library, why can't it spit out the files to install and then we e.g. FTP them over?

Yes, currently we use composer as a command, but let's not forget that it is also a library, which allows doing way more than the script does.

==

Overall:

Either you have write / read access to your server some way (direct write, FTP, ...) (composer "web" possible) or you don't.

You usually have your own read access to your PHP files / composer.json.

Could we stop treating composer as some kind of 'magic' thing, please?

It is not; it is a simple dependency manager (similar to how e.g. apt-get has worked for ages already).

Conclusion:

If we managed to install a tarball via the UI, we can just as well manage to install some libraries via the UI. We might need to use GitHub's / d.o's tarball services, or another party that transforms a git checkout into a tarball (packagist.org itself, maybe), but it is possible.

==

The workflow for modules installation should IMHO be:

- Download module
- Resolve dependencies
- Install module

==

Resolving dependencies is currently hardcoded to checking whether the other modules a module depends on exist, but make that part pluggable and suddenly Composer requirements fall into place naturally.

- 1. Make module dependency system extensible

=>

When trying to install a module (on the screen) we check for those dependencies, the composer check fails, the user gets an error about missing libraries.

They could now run Composer to install those libraries. (A simple composer install should be enough - and yes, I see how this is complicated by how Composer currently works.)

or with "composer web" we could just retrieve the list of files to install and update the root composer.json and FTP the files over (or something).

--

The module is installable then and can be installed normally.

Point 2:

I have a strong -1 to the proposal to make drupal.org only show a link with the require composer link and the reason is:

- patches

We usually don't patch vendor/, but we frequently did with custom and contrib/ modules.

Also we want to _lock_ one version into code and not update things just because composer update feels like it.
(This has been a huge pain in the past and a reason why an enterprise client I know banned composer for non-dev things.)

This is still the #1 reason not to use git submodules for contrib/ modules and I don't think composer would change this.

While there are some agencies using a git submodule workflow, most that I know still commit modules statically to the code base and that is a good thing overall.

Or lets phrase it like that:

Committing things to the code base directly is a valid use-case we need to support, too.

--

I am pretty positive it is possible to get composer to work with modules cleanly overall, too.

What about a dynamically generated composer.json in e.g. sites/default/files/composer/ (yes, I know security, phpstorage, etc. - just an example) that we reference from the root one:

e.g.

we currently have "drupal/core", why not "drupal/custom" and "drupal/contrib", which reference to:

/composer/contrib.json
/composer/custom.json

and are automatically populated by Drupal / a Drupal extension from the modules' information. Any module found on the filesystem in the right folder, e.g. modules/contrib/ or modules/custom/, is added there and referenced correctly ...

Due to how dependency chaining works, a simple "composer install" is all you need then.
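
One existing tool that could provide this kind of wiring (my assumption, not something proposed verbatim above) is the wikimedia/composer-merge-plugin, which lets the root composer.json pull in generated fragments; the version constraint below is illustrative:

{
  "require": {
    "wikimedia/composer-merge-plugin": "^1.3"
  },
  "extra": {
    "merge-plugin": {
      "include": [
        "composer/contrib.json",
        "composer/custom.json"
      ]
    }
  }
}

Drupal (or a small extension) would regenerate composer/contrib.json and composer/custom.json from the file scan, and a single "composer install" would then resolve everything.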

Proposed steps:

- Put module in

- Run file scan
- Dynamic composer.json is updated (same as we had a system table before / now have system k/v store entry)

=> new step: - Run composer install / web installation

- Install module

====

TL;DR: I don't think that much is needed to support Composer in core. I don't think supporting Update Manager is impossible either; all we need is a tarball from a git tag and we are at the status quo (a remote Composer tarball generator).

I don't believe dropping tarball support on drupal.org is the way to go; integrating it properly is key instead.

The advantage of having both static module support and Composer-managed modules (the latter would live in vendor/ and be excluded by the file scan for dependencies) living happily together is that it allows both workflows (static modules, added dependencies) to be equally supported.

Regardless of whether contrib chooses to rely on Composer or not.

Also no or just very little infrastructure changes are needed in the best case.

Conclusion:

Drupal.org and Drupal should not need to change (much) to support composer. Composer should change to support Drupal ;) (just kidding, but I hope you get the point ...). Or rather we should change our system to support both.

Actionable steps from above for sub-issue:

- 1. Make module dependency system extensible

Regardless of whether it happens in core or contrib, disallowing module installation when composer.json dependencies are unfulfilled is step 1. Given that core uses Composer, I vote to do this in core directly.

benjy’s picture

I think there are some really valid points in #71, just wanted to respond to these two points:

I have a strong -1 to the proposal to make drupal.org only show a link with the require composer link and the reason is:
- patches
We usually don't patch vendor/, but we frequently did with custom and contrib/ modules.

There are solutions to this; here's one: https://github.com/netresearch/composer-patches-plugin, which is somewhat similar to Drush make's solution.
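
As a concrete illustration using a different plugin than the one linked above (cweagans/composer-patches, so the exact schema is plugin-specific, and the issue number and patch path are hypothetical), patches can be declared in the root composer.json:

{
  "require": {
    "cweagans/composer-patches": "^1.0"
  },
  "extra": {
    "patches": {
      "drupal/some_module": {
        "Fix for hypothetical issue #1234567": "patches/some_module-1234567.patch"
      }
    }
  }
}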

Also we want to _lock_ one version into code and not update things just because composer update feels like it.
(This has been a huge pain in the past and a reason why an enterprise client I know banned composer for non-dev things.)

We can already do this by specifying the exact commit or version tag in the composer file?

Fabianx’s picture

#72:

1. Thanks for the information, I might consider that for my own projects :). This is still very much similar to git submodules usage - workflow wise, however.

2.

No, we want to lock that module and the libraries into the git repo, e.g. for security reasons. Dynamic checkouts from GitHub do not play well with security-reviewed / audited code. (Read: it is a no-go for at least some enterprise clients.)

And yes, obviously a .lock file helps and you can commit vendor/ too; however, in my experience the likelihood of having to change a module is much higher than the likelihood of having to patch a library. So being able to keep modules/* for at least Drupal 8 (we can reconsider in Drupal 9) would be good, I think.

Crell’s picture

Keeping Drupal modules in modules/* is the plan. We're just *enabling* them to be installed via composer, for which there is a composer plugin that puts them there instead of in vendor/.

As for the rest, it boils down to this:

1) Adding a composer-based library requires rewriting a PHP file on disk. That file is also needed before we have access to encrypted storage or anything else, and is very performance sensitive.
2) Doing that from an apache/nginx process is an unacceptable security hole.
3) Doing that from an FTP/SSH loopback requires a server that has an FTP or SSH pecl extension available. That is a minority of servers. It also requires the user to re-enter shell credentials in the browser (which is of debatable security).
4) Thus, the majority of servers cannot use an FTP/SSH loopback.
5) Thus, modules that have composer dependencies cannot resolve those dependencies from an apache/nginx process.
6) Thus, we can either make that subset of modules second-class citizens (composer_manager etc.) or we can just accept that they're only useful to people with shell access.

The conclusion reached at the BoF is that we'd rather do option 2. (Non-composer-depending modules are unaffected and can continue to be installed via drag-and-drop.) But that decision point is unavoidable unless you can come up with a secure way to run composer from the web process and generate PHP code that works on 90%+ of servers in the wild securely. (If you can, please share because all of Drupal, Joomla, and PHPBB would love you for it. Really!) Without that, this is the best option we have available.

David_Rothstein’s picture

I don't understand why Composer would make the security considerations behind the Update Manager any different than they currently are.

By design, the Update Manager lets people write (and rewrite) PHP files on disk via the admin UI. Once you can do that, you can take over the site anyway. Why would running Composer from inside the Update Manager be any less secure?

If you think the Update Manager needs to perform stricter security checks than it currently does (before allowing people to write files from the admin UI), that sounds like a separate issue, unrelated to Composer. (See also: #932110: On some servers, the Update Manager allows administrators to directly execute arbitrary code even without the PHP module)

Overall, Fabianx's comment in #71 makes a lot of sense to me.

Crell’s picture

David: Because Update Manager is not actually useful in many cases. It's clunky, and to do it securely requires the pecl ssh extension on the server... which few servers have. I actually suggested in LA that we could rewrite Update Manager to simply run "ssh -L composer require drupal/yourmodule", which would securely take care of everything.

That would actually be great, because then the code path becomes WAY simpler (because Update Manager has no work to do other than that ssh command). But there's 2 issues with that:

1) We have to make Composer fully supported first, which is what we agreed to do above. Having our cake and eating it too *requires* first doing the refactoring to make "composer all the things" work correctly.

2) That would be useful only for those servers that have pecl ssh installed... and anyone that can install a pecl module also has shell access to use Composer directly and cut out the middle man.

So basically, making composer-all-the-things a first-class workflow is a prerequisite to doing just about anything else. If someone wants to then write an Update Manager GUI for it that relies on pecl ssh, great! But that's not going to cover enough of the target audience for a GUI that it's worth doing for core.

Fabianx’s picture

#76: Respectfully disagreeing here. Update Manager currently works, and often one of the main reasons people say they use WordPress is its web install / update functionality. (Yes, still ...)

An SSH extension is not needed, and it is no more or less secure than FTP-ing from localhost to localhost, e.g.

Crell’s picture

Fabianx: My point still stands. The only good way to do Update Manager-esque functionality is to first do "composer all the things", and then build a GUI that can trigger composer require. So we do that first part first. Then if the second part is doable, patches welcome. Making composer fully supported is the required first step.

David_Rothstein’s picture

Crell: For sites that use Update Manager with a writable docroot (which is certainly a lot of them, maybe even most) we could have it do system("composer require drupal/yourmodule") instead and it wouldn't be any less secure than what Update Manager already does on those sites today.

Actually it might be more secure, because presumably you could limit it to downloading official packages (whereas currently Update Manager allows adding completely arbitrary code via the admin interface on those sites).

The main point, though, is that several people in this thread are saying improving Composer support in core means dropping or deprecating the Update Manager, but that just isn't true. At a minimum it would be very easy to maintain the status quo. And per above, there's even potential to make the Update Manager work better than it currently does for many users (but yes, that would certainly be a followup step).

Crell’s picture

David: It's more accurate to say that improving Composer support means there will be modules that are incompatible with FTP-based module installation (whether manual or via Update Manager), which is just something we accept. Also, from what I understand, Update Manager is barely maintained, and there was talk at DrupalCon of wanting to remove it, entirely independently of the Composer questions.

As long as we get to composer as a first-class citizen, I personally do not much care what tools we build on top of it or if they're in core directly. That comes down to what resources are brought to bear on them (aka, "patches welcome!").

andypost’s picture

I just want to raise the autoloader question again.
All the discussion is about a workflow of downloading and installing, but how should the autoloader be updated?
Looking at the installer, the autoloader is a static file that is not changed by core - so how can install profiles that ship libraries update it at install time?

timmillwood’s picture

Status: Active » Needs review

Sorry, I haven't gone back and read all the comments, but I do have a good understanding of the bigger composer picture in D8, and have one question.

Isn't this already possible?

The "Composer template for Drupal projects" from @webflo et al adds devel and token by default, it puts these in a web/modules/contrib directory, and works well. I am using this successfully with other modules too, as long as they have a composer.json file setting the type to "drupal-module".

For modules that don't have a composer.json I have started submitting patches:
#2514546: Required packages in composer.json
#2514620: Add composer.json
#2514716: Add composer.json
#2514574: Add composer.json
#2514764: Add composer.json
#2514654: Add composer.json
#2514602: Add composer.json
#2514596: Add composer.json
#2512816: Add composer.json

I have also written some docs advising that contrib modules have a composer.json
https://www.drupal.org/node/2514612

yched’s picture

@timmillwood : The issues you opened seem to be about modules that have no external dependency. I'm not sure we really want to ask *every* module to provide a composer.json, even if they don't have external dependencies.

For now, packagist.drupal-composer.org takes care of adding a minimal composer.json if the module doesn't provide one, which AFAIK is good enough for the Composer template for Drupal projects.

That might be more of a topic for the "do we move modules to packagist.org / do we officially endorse packagist.drupal-composer.org?" discussion (which I can't seem to find - maybe #2420929: Create git hook for Drupal Packagist?). Until one of the two options happens, it seems a bit premature to state that "D8 contrib modules *should always* have a (redundant if no deps) composer.json in addition to the .info.yml".
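
For context, the minimal composer.json being discussed is tiny; something along these lines (module name hypothetical), mainly giving the package a name and a type that composer/installers understands:

{
  "name": "drupal/mymodule",
  "description": "Example module.",
  "type": "drupal-module",
  "license": "GPL-2.0+"
}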

webflo’s picture

I think it makes sense to add a composer.json to every module because it's essential for a Composer-based workflow, e.g. if you host a module with patches applied on your own infrastructure (or GitHub).

Our migration to packagist.org is blocked on #1612910: [policy, no patch] Switch to Semantic Versioning for Drupal contrib extensions (modules, themes, etc), because packagist works only with SemVer.

davidwbarratt’s picture

#84,

Just a slight correction to what you said... Packagist requires a W.X.Y.Z (or X.Y.Z) version number; it does not technically require SemVer.

timmillwood’s picture

If packagist.drupal-composer.org is adding the composer.json, it looks as though the type is not being set to "drupal-module", because "drupal/pathauto": "8.1.*@dev" gets added to vendor/drupal/pathauto.

webflo’s picture

@timmillwood indeed, the composer.json for pathauto is messed up on packagist.drupal-composer.org. I'll look into it.

timmillwood’s picture

do we officially endorse packagist.drupal-composer.org ?

Maybe we should? There are a few pros and cons:

Pros
- Prevents us from polluting the official packagist site
- Gives a single place to get Drupal modules (no matter where the source is)
- Give the Drupal community more control

Cons
- Webflo's hosting cost (would need to move to DA infrastructure)
- Separates us from the PHP community
- New things scare people

I am really in favour of the idea and would like to see {"type": "composer", "url": "https://packagist.drupal-composer.org"}, in the drupal/drupal composer.json, so here's a patch for that.

Mile23’s picture

Status: Needs review » Needs work

How many times are we going to have this conversation?

@crell in #35: https://www.drupal.org/node/2477789#comment-9891271

Parallel Drupal-centric packagist is a bad solution.

We should contribute to the real packagist, and we should be interoperable with other projects in the Composer-verse.

- Prevents us from polluting the official packagist site

If we're 'polluting' the real packagist then we are doing it wrong.

- Gives a single place to get Drupal modules (no matter where the source is)

That's how packagist works.

- Give the Drupal community more control

We're already dependent on Composer and many libraries out of our control.

We can own and manage the drupal/* vendor namespace on packagist.

We will also eventually have Drupal extensions that live in vendor/ such as console commands.

The problem here is not one that can be solved by creating another piece of infrastructure.

The problem here is that a) We can't decide on a desired behavior, and b) There is no way to force the conversation to end, other than to release Drupal in a broken state.

That's why I made this issue: #2432893: Make Compser DrupalWTF Go Away

There's also this issue, which has some travis-ci behavior tests for composer at the top of the issue: #2002304: [META] Improve Drupal's use of Composer

Mixologic’s picture

Can we get more details on why hosting our own packagist is a bad idea? Why does having Drupal modules that are wholly dependent upon Drupal APIs in order to function benefit the rest of the PHP ecosystem? I get that we're trying to be less isolationist - but it seems like we're trying to mix our Legos with everybody else's K'nex because they are all "building blocks made of plastic".

Crell’s picture

Mixologic: Packagist provides a single unified source for Composer-installable packages, including framework-dependent packages. It's not specific to any one platform. There are hundreds, maybe thousands of Symfony bundles on Packagist. Also Silex providers, Laravel extensions, Zend Framework, Cake PHP extensions...

PHP developers are increasingly used to "if it's not on Packagist.org it doesn't exist". For developers from outside of Drupal coming in, telling them "yes we use Composer, but we have our *own* Packagist because... reasons" is a slap in the face. It's one more custom Drupal moving part they need to think about on top of not being on GitHub, having our own CI server, using patches, having a separate login system, and the many other barriers to entry we already have.

I'd turn the question around: Given that Packagist is *right there* and everyone's already familiar with it, why wouldn't we use it? In the case of GitHub there are legitimate reasons why it doesn't suit our needs that justify the new infrastructure we're building. For Packagist, what's missing that we need, that wouldn't be better addressed by us being good community citizens and helping out Packagist itself?

The only one I can think of is the version number requirements (need a semver-esque string, which we don't currently use), but just running our own copy of Packagist or Satis won't solve that (same problem, different server), and there's a strong voice of Drupalers saying we should just get over ourselves and use normal version numbers for modules anyway, regardless of Packagist.

Fabianx’s picture

#91: On the other hand, being dependent on both Packagist and Composer is also quite dangerous, as there is a single point of failure.

Recently we had some trouble with travis-ci and composer on the old legacy infrastructure with 100s of timeouts and very slow builds.

If composer.org is down, everything is down; if Packagist is slow, everything is slow. And while, compared to GitHub, it is at least OSS, it makes us hugely dependent on one centralized service.

While that is fine for Symfony, Silex and co. (as they never had their own infra and are also GitHub-bound), it might not make sense for Drupal ..

Just some food for thought ...

mpdonadio’s picture

#92 Assuming modules are also committing the lockfile, and that they aren't constantly changing library versions, isn't there a very high probability that dependencies will already be in the bot's Composer cache, which would really help mitigate the single-source problem?

timmillwood’s picture

Issue summary: View changes
Status: Needs work » Postponed

I understand the arguments in #92, but I think @crell makes a good argument in #91. The issues are: if we want all Drupal modules available on Packagist, we'll need to make sure all modules have a composer.json (not so much of an issue) and fix #1612910: [policy, no patch] Switch to Semantic Versioning for Drupal contrib extensions (modules, themes, etc).

Postponed: Waiting on #1612910: [policy, no patch] Switch to Semantic Versioning for Drupal contrib extensions (modules, themes, etc).

derhasi’s picture

With Drupal Packagist we currently auto-generate the package information from the .info files and automatically convert versions to semver. As long as contribs are not forced to contain a composer.json, we should figure out how to make that possible with packagist.org. In addition, from my point of view Drupal packages should be auto-generated from the drupal.org source, so we can make sure every project is available.

#92 is not valid for me, because the work needed to run a separate Drupal Packagist and make it reliable could also be used to support the Packagist infrastructure instead. And I guess Jordi would welcome the latter.

timmillwood’s picture

@derhasi - I'm not sure programmatically adding a composer.json and converting from Drupal versions to semver is stable. It'd be better to just enforce semver and ask devs to add a composer.json (if they want their module on Packagist).

Crell’s picture

What we discussed in LA is letting it happen organically. Only modules with a composer.json are Packagist/Composer-installable. OK, so module devs will figure it out themselves over time, or people will submit patches. No need for a big surge to get everyone a composer file immediately. Personally I expect it to take about 4 months after release before every module having a composer.json file is the de facto standard anyway, just organically.

Fabianx’s picture

So I am approaching this again from a different angle:

The biggest problem we have with composer vs Drupal are the fundamental assumptions the projects make:

Drupal's assumption: modules are dynamically loaded and can be enabled and uninstalled. They can influence system state.
The assumption of Composer / Symfony / most PHP frameworks: whatever is in the code base is enabled and active. System state is fixed.

So the main conflict is: dynamic vs. static

Composer statically processes things at 'app creation' time resolving all dependencies and everything.

In Drupal, for example, a module dependency is resolved only when the module is enabled, and then enabling either fails or the needed dependencies are enabled as well.

--

In a hypothetical world where everything (every library, everything) was a Drupal module, it would all work, as all dependencies and autoloaded things would be registered correctly and dynamically with the autoloader, e.g. how we do it for our modules with addPsr4 at kernel creation time. (And yes, that can be done in a more performant way, even dynamically.)
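
As a small illustration of that dynamic registration (a simplified sketch; the variable names are made up), Composer's class loader exposes addPsr4(), which is what allows namespaces to be added at runtime:

// $class_loader is the \Composer\Autoload\ClassLoader returned by
// autoload.php; $module_list maps module machine names to directories.
foreach ($module_list as $name => $path) {
  $class_loader->addPsr4('Drupal\\' . $name . '\\', $path . '/src');
}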

--

In a composer-only world everything is static, enabling / disabling a module has no effect at all anymore except for hooks and discovery that is based on the list of registered modules in the container.

But all classes / namespaces are automatically registered via composer, regardless if the module is loaded or not and also all dependencies are in the code base and active, regardless if the module is enabled or not.

--

We currently have a very weird mix, and we need to integrate things instead of working around the limitations of either system (both have advantages and disadvantages); both composer_manager in its current form and composer-all-the-way via 'drupal/module' have their own problems.

The first question however to ask is:

- What world do we want to have?

Dynamic / Static / Hybrid?

And once we have decided that, we need to ensure that a module is supported in the same way regardless of whether it was downloaded via Composer, via Drush, put in as a tarball from d.org, or added via the module installer.

e.g. if we decide we want all module namespaces to always be there, then this should just work.

So lets see what options we have:

1. All things being modules and being able to 'install' them: composer.json autoload information is parsed dynamically and added at module load time, and there are no deep dependencies. Dependencies need to be resolved outside - similar to how a module that needs another module can download it right now. Core's composer.json dependencies that are only needed by a core module are moved to that module's composer.json itself.

Main thing: a module's autoload information is only available when it is enabled - drupal/module is explicitly excluded from being merged into the main composer.json - regardless of whether it is managed via Composer at all.

2. All things being Composer: all modules _need_ a composer.json, and d.o's packaging provides a composer.json with the minimum information of autoload: psr-4: ...

There is no more special casing of modules in terms of auto-loading. All classes are loadable and exist.

3. A hybrid that makes sense.

I feel that is still the best solution to integrate both systems in a way that makes sense. I am still pondering solutions myself ...

What we have currently makes no sense at all.

e.g.

- drush dl memcache

Add to settings.php for default cache backend => fails (as module is not enabled)

- composer require drupal/memcache

Add to settings.php for default cache backend => works (as module is not enabled, but potentially still registered with the autoloader due to adding PSR-4 for usage by other systems and e.g. unit tests)

That is a fundamental thing to resolve.

[ I ran out of time on this post, but will revisit later when I have given it more thought. ]

timmillwood’s picture

@Fabianx these are some interesting thoughts in #98 but mainly things I don't think we can solve in 8.0.x.

Composer can download a Drupal module now (often via packagist.drupal-composer.org) and all non-drupal dependencies are downloaded too.

I think it would be awesome if modules were automatically enabled after being installed by Composer, but we'd also need to think about the users who are not using Composer. I hope that by about 8.5.x there aren't any, but you never know. ;)

Fabianx’s picture

#99:

You misunderstand:

Those are things we need to solve for Drupal 8 - unless we want to have total chaos.

Still working on solutions, checked both current composer_manager and webflo's approach.

timmillwood’s picture

Right, I think I have misunderstood, because I'm not sure how there would be total chaos.

I believe I am currently following what you call "webflo's approach" and it's working really well. If we can close out #1612910: [policy, no patch] Switch to Semantic Versioning for Drupal contrib extensions (modules, themes, etc) it would make this approach work even better as it wouldn't depend on packagist.drupal-composer.org anymore and would use proper packagist.

yched’s picture

Ditto @timmillwood #101, I don't think I see why #98 means total chaos.

Yes, the code and classes from a module are not loaded / not autoloadable unless the module is enabled, that cannot change.
And the classes from a non-module lib are exposed to the autoloader the moment the library has been downloaded by composer - that is, the moment some module with a dep on the lib in its composer.json gets downloaded. This lib code won't actually be called until a module that calls it gets enabled, so a lib added to the autoloader because of a disabled module is dead code anyway ?

Not sure what's the issue ?

Fabianx’s picture

Created #2536610: Ensure all modules are autoloaded with PSR-4 only if enabled as a first step into a composer integrated world.

Jason Judge’s picture

Just to add to this, I think there is some confusion between the underlying dependency management of composer, and the module management of Drupal. They are two different levels of abstraction, so far as I can see.

composer will go fetch a package at install time, and also fetch all other composer packages that it depends on, if they are not already installed. It will not fetch anything at all unless all its dependencies can be satisfied, so it won't install a package and leave out something that it depends on. A package in this case is some PHP code - from a one-liner to a 10,000 line library.

composer will also provide a handy autoloader. All that autoloader does, is make classes (and I suppose namespaced functions) available. In any place in your application, if you ask for a class or method or function, the composer autoloader will go find it for you, because it knows where it is. Note: classes and methods, NOT modules. composer has no concept of how Drupal organises these classes into bigger components. It has no idea that package A is a library, package B is a module and package C is part of core Drupal - not even an inkling. So if Drupal needs to decide whether a module is or can be enabled, then that is something for Drupal to decide. composer will see no difference in the underlying classes between an enabled and a disabled module.

So if a module depends on another Drupal module that is not a part of the composer dependencies, then it is up to Drupal to check that before the module can be enabled and used. If a module has a composer dependency on, say, the Guzzle library, then that module should be able to safely assume the library is there, because it got installed (or confirmed as installed) when the module was installed.

Now, you could provide an alternative autoloader. That autoloader could be dynamically controlled, registering just those packages that Drupal knows are needed for the modules that are installed and enabled. But I see no point in doing that - the classes are there, and Drupal will use them or not, as it (and it alone) sees fit.

Whether the module itself is installed directly using composer (from whatever composer libraries or version control repositories they are in), or whether it is simply dropped into the modules folder and then a second command is run to make sure its composer dependencies are met, is probably something to discuss.

Please note that the vendor directory where composer installs its packages by default should NOT be web accessible. If it contains web assets such as images, JS or CSS, then those need to be copied to a public area as an extra step. I believe tools such as bower and sass do that type of thing automatically, though I have not had much experience with those. Frameworks such as Laravel provide built-in commands for doing those file copies. That may help swing the argument concerning how modules are installed.

Forcing all modules to be squeezed into composer may not be such a quick thing to jump into, especially if you don't want to add another six months to Drupal 8's release date. Allowing modules to have composer dependencies - absolutely. There is so much great work out there that Drupal could use.

I hope that helps to clear up a few things. When I see suggestions such as modules managing their own vendor directories, I realise there needs to be a lot more understanding of how composer works, and how it sits at the centre of an application or framework and manages packages globally within that application.

-- Jason

davidwbarratt’s picture

It has no idea that package A is a library, package B is a module and package C is part of core Drupal - not even an inkling.

That's not completely true, see Composer's type and how that is used in Composer Installers.

Now, you could provide an alternative autoloader.

We already have an alternative autoloader, which is the crux of the problem. You can install a module, and Drupal's autoloader will pick up the classes. No need to use Composer at all.

Please note that the vendor directory where composer installs its packages by default, should NOT be web accessible.

Way outside the scope of this issue and has been discussed many times... #1475510: Remove external dependencies from the core repo and let Composer manage the dependencies instead

If it contains web assets such as images, JS or CSS, then that needs to be copied to a public area as an extra step. I believe tools such as bower and sass do that type of thing automatically, though I have not had much experience with those. Frameworks such as Larvel provide built-in commands for doing those file copies. That may help swing the argument concerning how modules are installed.

or just use Composer Installers to put the module in a web-accessible directory and not deal with copying the assets for now... which has also been discussed many many times.

Forcing all modules to be squeezed into composer may not be such a quick thing to jump into, especially if you don't want to add another six months to Drupal 8's release date.

What are you talking about? You can already use every module with Composer (Drupal 7 and Drupal 8).

Fabianx’s picture

I hope that helps to clear up a few things. When see suggestions such as modules managing their own vendor directories, I realise there needs to be a lot more understanding about how composer works, and how it sits at the centre of an application or framework and manages packages globally within that application.

Exactly - that is the problem for our legacy audience, which is not Composer-aware. Composer wants to manage everything (and that is totally great), but that is why, when you integrate a modular framework / product with Composer, you need to integrate it properly and not just throw it on top ...

e.g.

as a MVP use case for legacy, it should be possible for a user to:

- Download module Y
- FTP module Y to the server
- When trying to install it, Drupal gives the message (by asking composer): "Library X is missing, require it via cd core; composer require 'foo' or you can download it here: (dist url) and place it in sites/all/libraries/composer-libraries." (and recursively for other missing libraries as several messages)
- Download library X
- Put to the special folder
- On module installation, folder is scanned and dynamically added to composer as fulfilled package (to be determined how - possibly as dynamic package source with package.json being in sites/default/files/composer/packages.json and pointing to the local filesystem via type = vcs; still need to figure out how to register 'installation status' though).
- Module can be enabled like usual.

That would be an integrated workflow working hand-in-hand with composer and useful for legacy and normal users alike (as the message in itself is useful).

Jason Judge’s picture

@Fabianx If library X is available through composer, what would be the reason for not using composer to pull it in? Is it just down to the people who would need to use it, not being able to use it for whatever reasons? It seems like a lot of work effectively providing an alternative interface to composer, but I guess it is something that can be worked on later; whether early adopters of D8 are generally going to be composer-savvy and not need that alternative installer, I don't know. Just FYI, you can install the non-phar version of composer and access all its functions through its API, so it can be "driven" from the back-end rather than the command-line if it comes to that.
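For what it's worth, a minimal sketch of what "driving" Composer from the back-end might look like, assuming the composer/composer library is available to the application; the paths and the package name below are only illustrative:

  use Composer\Console\Application;
  use Symfony\Component\Console\Input\ArrayInput;
  use Symfony\Component\Console\Output\BufferedOutput;

  // Run against the project's composer.json; COMPOSER_HOME must be writable.
  chdir('/var/www/drupal/core');
  putenv('COMPOSER_HOME=' . sys_get_temp_dir() . '/composer');

  $input = new ArrayInput([
    'command' => 'require',
    'packages' => ['commerceguys/addressing:~1.0'],
  ]);
  $output = new BufferedOutput();

  $application = new Application();
  $application->setAutoExit(FALSE);
  $exitCode = $application->run($input, $output);

Whether letting the web process write the resulting autoloader is acceptable is exactly the security question raised further down, so treat this as a sketch of the API, not a recommendation.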

@davidwbarratt The composer installers look great. It is not something I've really noticed being used out in the wild before, so I overlooked them, though it may explain how Laravel gets its own nested-package structure in the vendor directory. Edit: just installed a WP plugin straight into a WP installation using composer. With dependency management in there between WP plugins at install time - analogous to module dependency resolving in Drupal - I can see how that could work really well. WP also has the issue that there are now two ways to install and update a plugin, and the two ways will happily obliterate what the other puts in. That's a bit uncomfortable.

As for autoloaders - you can run as many autoloaders as you want. So if it is easier to let composer autoload the packages that it installs, and Drupal to autoload any modules and libraries that are not installed by composer, then that would work. Oh, the day when I don't see another list of include()s...

Mixologic’s picture

Is it just down to the people who would need to use it, not being able to use it for whatever reasons?

Yes, exactly. There is a whole category of end users who have never used the command line at all, or may not have shell access to their server (think shared hosts) and the only thing they have available to them is FTP. They build entire sites without custom code, and their perception of Drupal is that it is a product, not a php framework.

Just FYI, you can install the non-phar version of composer and access all its functions through its API, so it can be "driven" from the back-end rather than the command-line if it comes to that.

Which has been considered, but it requires that you are using a web interface to write executable code (the autoloader) into a directory that the web server will execute. Security best practices dictate that the user that the webserver is running as should never be able to write code that it can later execute. Wordpress gets around that by ignoring that best practice as a tradeoff for a better user experience.

If Drupal becomes another site building option in the PHP developer's toolkit, then the strategy of how we use composer becomes dramatically simplified. However, I'm pretty sure that optimizing only for the PHP developer use case at the expense of the site builders' and less technical themers' use cases has some pretty significant direction, strategy and market implications for Drupal as a product.

davidwbarratt’s picture

Drupal 7 lets you download a module (from the UI) and then execute it... how is that following a best practice?

What if we wrote the autoloader to the private:// directory?

Fabianx’s picture

#109:

The module installer, IIRC, puts the files in place via FTP rather than writing them directly to the web root.

The autoloader is not the problem; the overall integration with composer is.

e.g. the problem is not the module itself, which is autoloaded in various ways.

The problem is that by uploading a library Y to a folder, its autoloading and other information needs to be available as if this library was installed via composer.

And that is the integration we can reach - as composer provides all that is necessary overall.

We just need to figure out how.

Mile23’s picture

So, limiting the scope of the conversation, let's just figure out how to have core tell the user that dependencies are unmet: #2536576: Add composer_dependency module, have it help users figure out how to install Composer dependencies

David_Rothstein’s picture

@davidwbarratt is correct - the Update Manager in Drupal already writes executable code to the web root directly from the user interface. It will only try to use FTP or SSH if it can't write directly.

We don't recommend people set their servers up to allow the docroot to be writable (because it's not great security-wise), but if the filesystem is set up that way (as many are) we don't deliberately degrade the user experience.

That is why for many users, it would be reasonable (and no additional security risk compared to Drupal's current behavior) to just have the Update Manager run Composer stuff directly, as was discussed earlier above. But for users with a secure filesystem setup and who only have FTP access or otherwise don't have the ability/skill to run stuff from the command line - those are the users who would have trouble with a Composer workflow.

Mile23’s picture

But for users with a secure filesystem setup and who only have FTP access or otherwise don't have the ability/skill to run stuff from the command line - those are the users who would have trouble with a Composer workflow.

Those are the ones whose hands we're holding in #2536576: Add composer_dependency module, have it help users figure out how to install Composer dependencies.

The problem is that by uploading a library Y to a folder, its autoloading and other information needs to be available as if this library was installed via composer.

And that is the integration we can reach - as composer provides all that is necessary overall.

'Uploading a library as if it were done by Composer' is not a good goal.

If we are going to have a separate vendor directory for contrib dependencies, then we can just have Composer do it. This means that a) we need Composer as a dependency, and b) we re-make a version of embedded composer for ourselves. That was the original goal of this issue, and it didn't work for a number of reasons. Also it's way too complex and would be a Drupalism.

Basically the user is going to have to learn enough CLI to manage whatever tools are the best for this. Composer itself doesn't know how to look for module dependencies, unless we tell it, with a plugin. Drush might have that built-in already... I haven't had my second coffee so I can't remember.

Mixologic’s picture

In order to help non-coders, develop a browser plugin or drupal.org functionality that lets people 'shop' for contributed projects and assemble a composer.json file using point-and-click actions.

Just throwing out there that there was a bunch of work done as a GSOC project(proposal?) a few years back to make exactly this:

https://www.drupal.org/project/project_browser
https://groups.drupal.org/node/136549

Not sure if that's something worth resurrecting, or if we want to start over.
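Mechanically, what such a tool ultimately has to produce is quite small. A rough sketch, assuming the point-and-click UI ends up with a list of selected projects (the project names, version constraints and repository URL below are only examples):

  // Turn a list of projects picked in a UI into a root composer.json.
  $selected = [
    'drupal/core' => '~8.0',
    'drupal/address' => '~1.0',
    'drupal/token' => '~1.0',
  ];
  $composerJson = [
    'name' => 'example/site',
    'require' => $selected,
    'repositories' => [
      ['type' => 'composer', 'url' => 'https://packages.drupal.org/8'],
    ],
  ];
  file_put_contents('composer.json', json_encode($composerJson, JSON_PRETTY_PRINT | JSON_UNESCAPED_SLASHES));

The hard part is everything around that file: running composer against it, and getting the result onto the user's server.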

Crell’s picture

*sigh*

As discussed back in LA, having a GUI securely leverage composer to download a module is easy: have the PHP script ssh to itself to run a single composer require command, then composer safely does everything else. The problem is that PHP-SSH support is not widely deployed. Telnet would be workable, too, if we wanted to go that route. But all of those require accepting a composer-first way of handling the files on disk.

We have to pick a direction and go with it. We had a direction picked in LA. See #60. We need to stop talking and do.

davidwbarratt’s picture

#115,

Is there a reason we can't just use the Composer library directly (sans command line)? I understand that won't work for source packages (that are cloned with git/svn), but for a dist it should work fine, no? I mean all it's doing is downloading a zip/tar.gz and extracting it... What if we had a GUI that would install any dist, but would tell the user to use the CLI for any source?

Crell’s picture

#116: Because "all it's doing is downloading a zip and extracting it" is not true. It's downloading a zip, downloading 0-n other zips, extracting all of them, possibly copying some files to vendor/bin, and then rewriting a few super-duper-important PHP files for autoloading. It's the "rewrites a few super-duper-important PHP files" that is the real sticker from a security standpoint. Even if we used Embedded Composer, we're still back at the most security sensitive files in Drupal being written to from the web GUI.

David_Rothstein’s picture

I created an issue for adding Composer support to the Update Manager (and marked it postponed on this one).

We discussed this all above and it seemed like it would be possible to do this in a way that works for a much much larger percentage of users than the ones who use PHP-SSH (which is indeed tiny) or telnet or anything else. (I don't see why rewriting "security sensitive files" via the user interface is worse than writing any other files - once you can add any PHP file to the system you can already take over the site. But if there's a problem with that, it could be discussed further on the issue I just created.)

Mixologic’s picture

From #60:

- Alex proposed creating a d.o "distro builder" that allows you to list the modules that you need, and get a tarball with packaged dependencies.
In addition, the system could remember your configuration so that when you want to go back and add another module, you don't need to start from scratch.
This was left undecided because it would be a big drupal.org change / DA investment, without completely solving our Composer troubles.

That doesn't sound like we've got a direction picked, but maybe I'm unclear here. What exactly is the workflow for a site builder? I definitely see a lot about what we can't do, but it certainly isn't clear from the issue summary, or from #60, what it is we're going to do to solve that site builder / non-developer workflow.

Mile23’s picture

What we're going to do for the site-builder workflow for 8.0.0 is *nothing.* :-)

We might help them understand what's going on with #2536576: Add composer_dependency module, have it help users figure out how to install Composer dependencies, but they'll have to take control of the CLI themselves.

This still leaves us with a bunch of work to do so that they can use composer from the command line without hiccups.

Fabianx’s picture

#117: That is an implementation detail of composer.

The autoloading information would not need to be written to PHP.

It could live also in the database or anywhere else, or even in a .json file.
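As a minimal sketch (the file name and format here are hypothetical), an autoloader fed from a JSON class map rather than from generated PHP could be as simple as:

  // Load a class map that was written as JSON instead of generated PHP.
  $classMap = json_decode(file_get_contents('sites/default/files/composer/classmap.json'), TRUE);
  spl_autoload_register(function ($class) use ($classMap) {
    if (isset($classMap[$class])) {
      require $classMap[$class];
    }
  });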

It is also not particularly problematic from a security standpoint.

Also putting things into bin/ is pretty much irrelevant for our use cases.

The biggest remaining thing is the installation status.


Also, saying that we will do nothing for 8.0.0 for site builders is not correct, but I agree on adding a composer_dependency module as a first step, though I think it should not be a module, but core functionality.

We already depend on composer for core, so we may as well make use of it.

joshtaylor’s picture

Just a note, a distro builder would not work, as when a library updates and a module requires the latest version (say address requires the latest addressing update from composer) the site builder would be SOL.

Mile23’s picture

The autoloading information would not need to be written to PHP.

It could live also in the database or anywhere else, or even in a .json file.

Autoloading of stuff in vendor/ happens through a number of PHP files generated by Composer. See one here: http://cgit.drupalcode.org/drupal/tree/core/vendor/composer/autoload_rea...

We should let the Composer toolset handle that during the install/build phase, and not runtime.

During runtime, Drupal determines classloading for enabled modules and then adds their namespaces to the loader object generated by Composer.
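Roughly, a simplified sketch of that runtime step (the paths and the module name are illustrative):

  /** @var \Composer\Autoload\ClassLoader $loader */
  $loader = require 'core/vendor/autoload.php';
  // Register an enabled module's namespace with the Composer-generated loader
  // instead of regenerating any autoload files.
  $loader->addPsr4('Drupal\\mymodule\\', 'modules/mymodule/src');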

We don't want to emulate this for our Composer-based dependencies from contrib, because Composer is already better at that than we'll ever be.

And again, since those dependencies are specified in composer.json files, we should let the Composer tools do that. How? Good question. :-) Just not from the web-based UI, for reasons mentioned above.

We already depend on composer for core

Not true. :-) http://cgit.drupalcode.org/drupal/tree/core/composer.json

Again, all of the stuff we do have in core that's *from* Composer is auto-generated *by* Composer.

Fabianx’s picture

#123: I have written a composer based alternative faster autoloader, so I am well aware of autoload_real.php, its strengths and weaknesses. Please don't try to educate me on composer, autoloaders or what can't be done.

I am deliberately going another way than what has been discussed, because I don't agree with the idea that without composer installed, we are not supporting it.

We have gone to great lengths to support shared hosting, but now here suddenly we should not do that, because the 'holy' composer has just this one way of doing things ...
(statically with PHP writable files)

Also we still have PHP storage as an option to write the autoload files - in the worst case.

e.g. we could decorate what composer does.

However, the current autoload_* is not even efficient; it takes 1-2 ms alone (of 5-6 ms for a page cache hit) just to load the autoloader.

I am also well aware of how package management works in composer and how e.g. TYPO3 parses composer JSON files for their legacy modules - not installed via composer.

So if our direct 'competition' can do it, so can we.

And please stop selling it as the holy grail, it is just software - with strengths and weaknesses.


Yes, in Drupal 7 composer manager is really straightforward with using sites/all/libraries, but in Drupal 8, this is indeed more complicated - especially with not updating all of core/vendor.

Or downgrading to beta12 by running composer from the root :D.


Yes, let's add the composer library to core then; that is the only way shared hosting will be able to use it with some degree of support at least.

Mile23’s picture

And please stop selling it as the holy grail, it is just software - with strengths and weaknesses.

It's not that it's the holy grail, it's that I don't want to make a lot of Drupalisms. We should use it because it's very good at telling us when packages can't resolve dependencies, not because of super-duper autoloading or whatever. Also we don't want to reinvent the wheel.

The problem right now is that we have a broken system for dealing with Composer with Drupal. Let's make it straightforward for people to install things, rather than the current requirement (manually changing the autoload.php file), while at the same time not making anything so complex that we can't have this solution before the 8.0.0 release.

totten’s picture

I just want to give a +1 on #35 -- it's quite an ideal arrangement to "composer all the things" and build a general composer UI (for editing "composer.json") with very little that's specific to any CMS. (Disclosure: I've been angling to implement a similar arrangement for CiviCRM's extension system, but it's hard to find time for unfunded projects.)

There are a few random things (from this fairly long thread) which I'd like to add on to:

1. Much of the UI is about searching packagist and designing "composer.json". Most of that can be done in pure JS (if packagist.org has CORS... or if someone sets up a CORS-enabled bridge). A JS implementation would make it amenable to embedding within multiple PHP applications/frameworks.

2. Deploying composer.json via PHP-SSH: You might want to look at phpseclib -- http://phpseclib.sourceforge.net/ssh/examples.html -- which implements SSH without PECL. I don't have any experience with their SSH class (besides the trivial example above), but their other stuff (X509/RSA/AES) has been quite functional. The general approach of the library is to use PECL extensions when available... and fall back to pure-PHP implementations when necessary. (A minimal sketch follows this list.)

3. Securely deploying a new `composer.json` is tricky if you want "one size fits all". But if you expect that different sysadmins prefer different deployment strategies (because they fundamentally want different trade-offs between security and ease), then it's more rote -- e.g. https://gist.github.com/totten/9ad0c41ef1d5e53fa9ec

4. It seems better to call `composer.phar` externally than to use the library mode. Firstly, it's easier to swap between different deployment strategies (like #3). Secondly, there's less risk of getting dependency conflicts between D8-core, D8-contrib, and composer-lib. (You're not likely to get dependency conflicts now; but Drupal has a much longer support cycle than composer, and things may be very different in 2.5 years.)
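On point 2, a minimal sketch of the phpseclib approach (assuming phpseclib 2.x; the host, credentials, paths and the contents of $newComposerJson are placeholders):

  use phpseclib\Net\SFTP;

  // Push an updated composer.json to the host over SFTP and run composer
  // there, so the web process never writes executable code itself.
  // $newComposerJson is assumed to hold the generated composer.json string.
  $sftp = new SFTP('example.com');
  if (!$sftp->login('deploy_user', 'secret')) {
    throw new \RuntimeException('SSH login failed.');
  }
  $sftp->put('/var/www/drupal/core/composer.json', $newComposerJson);
  // SFTP extends SSH2, so the same connection can execute the update.
  echo $sftp->exec('cd /var/www/drupal/core && composer update --no-interaction 2>&1');

As point 3 says, which credentials are used and how they are stored is the part that varies per deployment strategy.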

joshuami’s picture

Adding a related issue.

Participants may want to follow the core conversation at DrupalCon Barcelona (https://events.drupal.org/barcelona2015/sessions/composer-and-drupal-8). We are going to try to give remote participants a way to ask questions and bring up points for this session.

Version: 8.0.x-dev » 8.1.x-dev

Drupal 8.0.6 was released on April 6 and is the final bugfix release for the Drupal 8.0.x series. Drupal 8.0.x will not receive any further development aside from security fixes. Drupal 8.1.0-rc1 is now available and sites should prepare to update to 8.1.0.

Bug reports should be targeted against the 8.1.x-dev branch from now on, and new development or disruptive changes should be targeted against the 8.2.x-dev branch. For more information see the Drupal 8 minor version schedule and the Allowed changes during the Drupal 8 release cycle.

fgm’s picture

Version: 8.1.x-dev » 8.2.x-dev

Since 8.1 has not been released and this discussion is about a breaking change, bumping to 8.2.x.

andypost’s picture

Version: 8.2.x-dev » 8.3.x-dev
Status: Postponed » Needs work

Looks like dependencies are mostly resolved.

Mixologic’s picture

This issue really needs a summary update.

But, since it popped up in my feed, I figured I would provide some data which you may find surprising.

I analyzed the July 2016 apache logs for our static files server (the project tarballs and zips), looking for Unique Ip address/User agent combinations.

Here's what I found:

For D7 projects, there were a total of 235190 unique IP addresses + User agent string combinations that downloaded a module from our server.

Of that:
59576 had a user agent string that indicates automated build tools: egrep -i 'composer|wget|curl|ansible|python|chef' julyd7uniques |grep -vi 'drupal' |wc -l
46790 had a user agent string of "Drupal"
128824 had a user agent string that did not indicate build tools: egrep -vi 'composer|Wget|curl|ansible|python|chef|drupal' julyd7uniques |wc -l

For D8 projects there were a total of 106343 unique IP addresses + User agent string combos downloading from our server:

28710 Build tools
63410 Non build tools
14223 Drupal user agent

This indicates to me that we have a significant userbase that downloads tarballs via a browser and does *not* utilize drush/composer or other 'advanced' build methods.

If I'm correct in assuming that a user agent of "Drupal" is the Update Manager, this also indicates that the Update Manager is, in fact, widely used, even in D8.

xjm’s picture

Title: Allowing modules with Composer dependencies to be 'used' as downloadable packages will break sites » Use composer to build sites
Project: Drupal core » Drupal core ideas
Version: 8.3.x-dev »
Component: base system » Idea
Category: Bug report » Feature request
Priority: Major » Normal
Status: Needs work » Active
Issue tags: +Needs product manager review, +Needs framework manager review

The framework and release managers discussed this issue and we believe the part of this that is a major bug is covered by #2494073: Prevent modules which have unmet Composer dependencies from being installed. We agreed to move the issue to the Ideas queue to evaluate it more thoughtfully, since it would be a significant change.

Also, agreed on the need for an IS update. Thanks everyone for your input here so far!

Mile23’s picture

Did the framework managers agree to the basic premise of the original issue, which is that D8 extensions should *only* be available through Composer? Or do we need an IS update because we've decided against that?

Because if we're not going to have tarballs for extensions, then we don't need #2494073: Prevent modules which have unmet Composer dependencies from being installed since Composer will figure it all out for us.

If we *are* going to have tarballs for extensions, then we should just postpone this issue on D9.

David_Rothstein’s picture

Does this mean that #2538090: Allow the Update Manager to automatically resolve Composer dependencies should be unpostponed now? To be honest, I no longer remember why I postponed it on this issue in the first place :)

philsward’s picture

I honestly have no idea where to post this and quite frankly, I didn't read all the comments, however my question is simple: "Why can't Drupal include the composer.phar bin and use it at a local site level for everything within the local site directory?"

Instead of making someone install composer to their server or user account, then jumping through a bunch of hoops to download core through composer or deal with whatever dependencies, make the composer bin a part of core. This way Drupal (and site builders) no longer has to worry about whether composer is installed and working properly, because it's provided with the core installation files. If composer were packaged with core, Drupal could then scan for contrib modules and third party dependencies and automatically inject and update them behind the scenes on cron runs. Rather use the server bin? Flip a switch in settings.php.

I'm a site builder and for the last year, I've struggled to get composer installed and I've struggled to use it. I don't care how or why it works, I don't care to learn it, I just want to be able to use Drupal. How Drupal installs itself and updates itself is all stuff I don't care about. I just want it to work in the background and leave me alone, letting me work on building sites. When I install a module with external dependencies, I don't want to have to go digging through readme files and hope it points to a live updated link to online docs that will tell me the command I have to run just to install it. Then I have to remember to update it periodically? So much overhead...

If Drupal is going to use Composer, it should INCLUDE composer, then automate composer so the folks who really don't need to know anything about composer don't have to be bothered with it. At the end of the day, we all have a limited amount of time. I'd rather spend my time building sites, not trying to figure out how to install a module and keep it updated by hand.

Just my $0.02

davidwbarratt’s picture

That's like saying... why make users install git? or shell for that matter?

If you want everything bundled together without having to install composer, you can download the tar.gz or zip file which does not require Composer.

mradcliffe’s picture

If you want everything bundled together without having to install composer, you can download the tar.gz or zip file which does not require Composer.

I personally find this comment offensive for site builders who want to build Drupal sites or for contrib developers who want to share their work with site builders and other folk. We should encourage people to post opinions, and not drive them away with imperatives.

Rather I think we could address

- "I've struggled to get composer installed and I've struggled to use it." (emphasis mine).
- "I don't want to have to go digging through readme files and hope it points to a live updated link to online doc's that will tell me..." how to "...install it." (emphasis and edit mine).

And then we can come up with a way to empower site builders to create maintainable Drupal sites that depend on composer dependencies, which will improve Drupal's reputation as easy to use.

joelpittet’s picture

+1 for including composer with core and letting it resolve the contrib dependencies.

Contrib doesn't and shouldn't package composer dependencies with the packaged zip: because there are shared dependencies, that wouldn't work the way core does it as things stand.

davidwbarratt’s picture

I just don't think including composer resolves your problems?

I mean I wish we had a UI for Composer and installing/updating modules in Drupal just used Composer's API.

Maybe we should have a new issue to build a UI for Composer within Drupal?

mradcliffe’s picture

Thank you for clarifying your comment, @davidwbarratt. I also don't agree that packaging composer would resolve the problem.

borisson_’s picture

Using composer to resolve dependencies is a good thing, but shipping it with core sounds like a lot of trouble.
Downloading the zip is not a solution either, because contrib modules like search api solr, commerce, address, swiftmailer (and a lot of others) use external dependencies; they need composer to write the autoloader and vendor directories.

I really think that suggesting https://github.com/mglaman/conductor as a tool for people to help build their local installations of drupal is the best approach we can take. It simplifies composer for people who don't feel comfortable on the command line.

I'm not sure if conductor verifies the availability of composer on the local machine, but we could include that and provide a tool for installing composer through conductor.

I think we should ask people that are having trouble with the current way of working to test conductor and report their findings and promote conductor as a tool in the php-community.

Mile23’s picture

You can, actually, just install composer.phar somewhere in your project and not globally. https://getcomposer.org/download/

Those instructions will put a composer.phar in whichever directory you run them in. Then you can say ./composer.phar install

But it's a bad idea to package the phar file with Drupal (or leave it there when your site is live), because a) you should assume it's out of date by the time you use it, b) it may have unknown security problems, and c) it's another packaging step.

As far as making it easier, there's #2691003: Catch error(s) from composer install not being run and return friendly error

bojanz’s picture

We already have an issue about rewriting update_manager to embed Composer (can't find it because our search is awful as always).
That could work.
However, I am worried about resource requirements. Composer is known to be memory hungry and take up to 512M of RAM, sometimes more. No way that will work on any shared hosting or small VPS. Which brings us back to needing to run this locally, then upload the result, in which case a separate GUI like Matt's Conductor makes more sense.

dww’s picture

The issue bojanz is talking about is already listed as one of the related issues to this one:
#2538090: Allow the Update Manager to automatically resolve Composer dependencies

philsward’s picture

@davidwbarratt

That's like saying... why make users install git? or shell for that matter?

Yeah, I don't use git either. I don't "need" git in my single person, non-programming environment. (That's the beauty of Drupal... you don't have to be a programmer to use it!)

---

I think I now see where the main disconnect is in the argument over "why it should be included" or "why it shouldn't be included". Developers look at this from a "Stage Local, Push Live" standpoint. While sure, that's probably the best practice, certainly from a developer standpoint, there are a lot of Site Builders (myself included) who don't use that workflow. They unpack (or upload) to live and manipulate live. There is no local staging workflow.

Let's take into account all of these cPanel hosting services out there that offer Drupal as a drive-by-install through the GUI dashboard. If someone doesn't know anything about Drupal, or git, or composer, they now have to learn all of these tools just to use Drupal which was provided as a super easy install from their cPanel dashboard. Don't forget the new requirement to learn shell and linux commands. Maybe they won't need those tools in the beginning, but if they see Commerce, or some other module with dependencies, it's guaranteed they'll get scared off.

So, when a Developer is looking at this, they are updating local, working out all of the bugs local, then pushing up to live. Developers have the knowledge and need for those tools and workflows. However, someone who is working from live, probably wants to stay as close to the GUI as possible. Regardless of whether you're local or live, keeping the dependencies up-to-date is a necessary evil. Unfortunately, up to this point, it's really only been approached from the local dev standpoint. Can you install composer live? Yes. But that doesn't get rid of the issues surrounding the new requirement for a live user to open up a shell, learn the commands and pray to the Drupal/Composer gods it works...

This brings me back around to "my" thinking of having composer packaged with core. It removes the need for the "live" users to ever need to know anything about Composer. If composer is packaged with core, that means it gets updated, every time core gets updated. And if composer.phar is already a compressed bin, I don't see how this creates more overhead... It's one additional file to add to the .tar.gz. I also don't understand how it presents "security" problems, because it would simply be overridden with the newest version of Core, which was released because of a composer.phar security flaw. I also don't understand the security implications when all of the dependencies pose individual security threats and the ONLY time they get updated on "live" sites, is when the next version of Core is released and the site maintainer gets around to updating the entire file base. We're more worried about composer being out of date on a live site than a bunch of Core dependencies being outdated? Come to think of it, composer can self-update. Why couldn't it self-update on periodic cron runs?

What I DO see from this, is that Core now knows about composer and can rely on it to be there and up-to-date with a version it knows it can work with. That means all of core's dependencies can now be updated (if needed) outside of a Core update. It also means that new modules can leverage the use of composer to automatically register and download the necessary dependencies without user interaction. It also means that "if" the core group wanted to, they could offer a package WITHOUT the core dependencies, and upon install, it would go fetch them automatically. It also means core could slowly move away from a full package update, and instead self-update with only the files needed for the update. (since the "live" site builders are a group of users who aren't using git to pull only the files that have changed...) It also paves the way for allowing Core to "push" security updates to sites that allow automatic security updates, which are typically stock sites anyway with little to no custom coding and typically the last ones getting updated anyway. Whether all of this is done in the end through composer, doesn't matter to me. I just hope to see something get done.

I'm not looking for rebuttal on this, I'm just hoping to show a different perspective on how Composer is perceived from a "live" site builder standpoint. I'm not good at dealing with a bunch of technical mumbo-jumbo CLI stuff. If I were, I'd be a programmer or a Linux admin with no reason to comment on this. I can however, click the mouse with the best of 'em though and that's where I like to keep my focus trained.

@dww Thanks for the link :) I'll follow that thread and see if something comes from it.

Olafski’s picture

Maybe we should have a new issue to build a UI for Composer within Drupal? (#139)

There is already an issue for it: Do not leave non-experts behind: do not require Composer unless a GUI is also included

cilefen’s picture

mukhsim’s picture

Adding multi-site issue #2306013 to the list of the related issues: "In composer's data model you are running one application off of one codebase, and in the multisite model of drupal you are running multiple applications off the same codebase. So we have a collision here about how to handle this sort of build." - comment by @mixologic.

groovedork’s picture

This is such a user hostile idea.

Pretty ironic: while "getting off the island" technologically, Drupal moves onto another island.. because now only highly technological experts can install it.

I don't have a command line
I don't want a command line

I don't have a 'staging environment'
I don't want a 'staging environment'

I don't want an electron app
I don't want to get in touch with my hoster

I am flabbergasted that in Drupal 8 you now apparently need Composer to install the address module.
I mean, some hightech command line system with obscure incantations to put some files on a webserver.

"We're making it easier". No, you're not.

"We're making it more secure". Yes, the few people left will have an easy time getting a secure system. The rest of us will be forced back onto Wordpress.

Yuck!

Why can't Drupal be both secure and easy to use? There is no rule against it. No natural balance that has to be kept, where Wordpress is the easy system and Drupal is the hardcore one. This is the internet, there is no roof.

lussoluca’s picture

Hi groovedork,
have you tried http://composy.io? It lets you choose the modules and themes you need from an easy to use web interface, then it runs Composer remotely and all you have to do is download the package and upload it to a server.

groovedork’s picture

This is what actually is happening:

...
4) Drupal 8 gets released.
5) Site builders have a short WTF experience when they are forced to use Composer.
6) Site builders feel forced out of Drupal, lose faith in the community's ability to understand site builder needs, and start looking for alternatives. "Didn't somebody fork Drupal7?"
7) THE END.. of Drupal?

I am very much inside the WTF moment right now :-)

groovedork’s picture

@lussoluca - that's cool, but I don't use open source software to then have to use a third party commercial service. It's like putting a band-aid on an amputation.

timmillwood’s picture

@groovedork - Maybe Backdrop would be a better fit for you? I've been working with many enterprise level companies and all they want is a composer workflow. I've also been working with an agency who do a lot of PHP dev but not Drupal and their first question was "So you'll just supply us a composer.json?". This is just how most modern PHP platforms work.

I seriously believe Drupal is changing, and yes change is often sad, but it's not a hobby platform anymore.

alison’s picture

Damn it was like, 4 weeks ago that I decided to skip ahead to 8.4.x when building a mini distro for our team, and oh-em-gee the things I've learned about what is (or may be? i'm honestly not sure) coming down the pike for Drupal. I can't believe how close I was to remaining blissfully unaware for at least 6 more months...

ANYWAY! Thoughts that are actually relevant, hopefully -- I actually read the vast majority of comments on this thread, but I'm sure I still missed some -- but anyway, again:

  • I'm extremely concerned about composer's memory usage, for the sake of ppl on more limited hosting, or heck, me doing dev work on my own local environment, holy crap my computer was so much faster 4 weeks ago... Also, it's just plain frustrating to have to wait multiple minutes just to see that I'm still getting the same dependency error. I know composer isn't actually *slow,* it's just doing a TON of work for me, so of course it takes time, but omg, it takes so much time.
  • RN some people have composer-managed Drupal sites, and some people don't. I'm sure there's a reason why that can't continue, but, could someone explain to me why that can't continue? (allowing either method)
  • Was a decision made re: multisites, or not yet? I feel unsure, just based on what I've read here.
  • Holy cow, @bojanz, thanks so much for writing up the notes from that BoF two years ago.
  • @crell's earlier presentation of the situation (starting in #35) was compelling for me -- I can see how repairing/improving the structure/fundamentals/etc. of Drupal + composer would be a prerequisite for going in the weeds of coming up with an admin GUI for site builders to use.
    • I'm sure one already exists, but could someone point me to an issue that's focused on a "Drupal + composer" admin GUI?
Olafski’s picture

@alisonjo2786 - I guess that's the issue you're looking for: Do not leave non-experts behind: do not require Composer unless a GUI is also included

fgm’s picture

@alisonjo2786 regarding memory usage and install time, a few tips may help:

  • in most cases, the PHP "cli" SAPI used to run composer commands will have much higher memory limits than the web SAPI, often only limited by the host VM
  • composer will be much faster if you don't have the xdebug extension loaded in the CLI configuration. Even having it loaded but set to disabled will be faster than leaving it enabled, although the biggest gains come from removing it entirely (with PHP 7, expect a factor of 3 in install time between with+enabled and without).
  • waiting for composer to fetch requirements is less frustrating if you add -vvv to your composer install command : it will show you what it does, instead of staying mute for several minutes
Mile23’s picture

alison’s picture

@fgm -- Thank you!! I was finally trying this out today, and discovered that even though I have PHP 7.1 installed, my system is using 5.6, so that'll be my project tomorrow haha, and then I can let y'all know how great it is with 7.1 *and* xdebug disabled for cli.

In the meantime, I lol'ed at "waiting for composer to fetch requirements is less frustrating if you add -vvv to your composer install command : it will show you what it does, instead of staying mute for several minutes" -- fantastic idea!

AdamPS’s picture

This issue seems more polarising/contentious than most. Possibly this is partly because some people are suggesting that every D8 site must change even though other people don't want/need to change and are pointing out problems.

Is there any reason why we can't move forward as normal for the Drupal core deprecation policy: avoid breaking existing sites, don't remove functionality until the next major version = D9? I.e. add composer integration and also keep the tarball based installation working (= via GUI or automated via drush).

Current status is:

  1. Users running composer and content with it
  2. Users running composer because they need modules that have dependencies, but finding it difficult
  3. Users installing tarballs because they can't use composer (not integrated/outstanding issues), but would like to use modules that have dependencies
  4. Users installing tarballs for whom composer is not necessarily suitable (memory limit - I have seen composer fail even with 2GB; shared hosting where FTP is the only option; auto-installers e.g. Softaculous; etc); they can accept the limitation that they can't install modules with dependencies, or might do a manual FTP of one simple dependency

How about if the composer enthusiasts could work on #2845379: Provide optional composer integration but don't force users to understand how to use composer (contrib or experimental module?), #2002304: [META] Improve Drupal's use of Composer and other related issues, and in the meantime leave the existing installer alone. At that point (2) and (3) could merge into (1) and most people would be using composer. The existing installer could then be deprecated, subject to investigating (4).

In the meantime, we need to keep the existing installer working, which mostly it should without any further development. However we do need to consider #2906637: [META] Drush and core compatibility is fragile.

pingwin4eg’s picture

@AdamPS Nobody's gonna break existing sites. Why do you think so?

Using drush is almost the same as using composer - both are "terminal commands".

AdamPS’s picture

Nobody's gonna break existing sites. Why do you think so?

Great, basically all that I am asking is not to break existing sites.

However:

  • The issue summary has a proposed resolution of "Remove the update manager".
  • The drush team are reporting that D8.4+ requires drush 9 which removes "pm-update".

I work as a site installer for simple sites. I have delivered sites to customers and they are automatically upgrading using cron/drush - if this breaks, customers will call me. I have scripts/systems that call drush pm-* and co-workers who understand how to use the update manager. Composer appears to require me to move all existing sites into a 'web' sub-directory and alter my apache config. Moving to composer potentially gives me plenty of days of unpaid work.

The level of complexity to use composer on the terminal is substantially more than "drush pm-upgrade" and there is a lot more discussion around this in #2845379: Provide optional composer integration but don't force users to understand how to use composer. I had a first investigation of moving to composer which took about 3 days and left a variety of unresolved issues, for example

  • what is the composer equivalent of "drush pm-update --security-only"
  • running out of memory even though I have 2GB
pingwin4eg’s picture

>>> "Remove the update manager"
>>> drush 9 which removes "pm-update"

That's because they can't handle non-drupal dependencies. Furthermore they can't handle resolving conflicts if different modules require different versions of the same dependency. Composer can. I agree that some UI for it would be useful. But that anyway will require installing composer on the system.

>>> Composer appears to require me to move all existing sites into a 'web' sub-directory

Not at all. I suppose you're talking about https://github.com/drupal-composer/drupal-project - but that is just one of the possible implementations. You can live with the vendor folder in your drupal root.

AdamPS’s picture

@pingwin4eg Thanks for the reply. So it seems that you are proposing to break existing sites:-)

I am already aware that the existing tarball based install mechanisms cannot handle non-Drupal dependencies (beyond those shipped with the core tarball). However there are plenty of existing sites that do not require dependencies. The existing mechanisms need to stay to avoid breaking these existing sites.

In other words, Drupal currently meets BASIC needs but not the COMPLEX. Why do the people interested in adding the COMPLEX seem to be so keen on removing the BASIC at the same time? Please can someone explain why we can't have both mechanisms and everyone be happy.

In terms of your second point, one of the reasons people currently find composer difficult is that the instructions are lengthy, complex and full of choices. As far as I can see, option B keeps the existing directory structure but fails to update scaffold files. Not documented, but as far as I can see in D8.3 B seems to miss the "installer-paths" config for "drupal-library". If the community is going to support two options it seems odd that they are both variants of composer - why not instead support tarball or composer(A).

ressa’s picture

@pingwin4eg: Whenever somebody says

Using drush is almost the same as using composer - both are "terminal commands".

(#160) I refer them to Composer and Drupal are still strange bedfellows :-)

However there are plenty of existing sites that do not require dependencies.

Exactly. It would be interesting to know what the ratio is. The Address module is mostly used as the go-to example of why Composer is required, whereas Webform can't be used as an example any longer, since it supports a pure Drush install of libraries with drush webform-libraries-download, as can be seen here: Webform Libraries, thanks @jrockowitz!

pingwin4eg’s picture

@ressa And how is that article related to what I said? After reading it, Drush and Composer are still command-line utilities, aren't they?

AdamPS’s picture

I fear we are in danger of getting distracted into advocacy.

Some users find composer easy and want to use it directly. Others find composer complex and whilst they are keen to get the benefits, they don't want to learn to use it - hence these users need an integration. However there are also users who want to continue to use the existing tarball installer.

It seems easy, everyone can be happy, provided that we stick with "Nobody's gonna break existing sites"

Does anyone have a good reason why we need to remove support for the existing tarball installer in order to add support for composer? Otherwise I'm hoping we can get some +1 votes for the strategy in #159 where we step-by-step improve composer support whilst maintaining back-compatibility.

alison’s picture

Drush and Composer are not the same (in terms of lift/skill). I repeat: They are not the same. Tons of people are fluent with drush and find composer to be an untenable struggle. I completely understand why these CLI tools could seem comparable to many ppl who are already comfortable with composer, or just happened to learn it more easily than others, but they are not the same, I promise.

(Plus, as was mentioned, the things that are blocking drush from continuing to integrate well with Drupal are the same things causing trouble for continued GUI update/maintenance tasks.)

THAT SAID, I strongly encourage y'all to check out this solution-focused issue created by @Mile23 (a composer evangelist!), for continued conversation about how we can move forward *together*, please please please:
#2908394: Use Composer to build sites without forcing users to learn Composer

(Yes this thread has been contentious, but it doesn't need to continue to be contentious, please!)

AdamPS’s picture

@alisonjo2786 thanks for comment and support.

One message of #2908394: Use Composer to build sites without forcing users to learn Composer is to maintain back-compatibility and it's great to see more support for that. I am not so convinced by the other technical details in that issue, and suggest that we don't get too distracted by criticising that. I am aware that many experts on this thread have discussed the details in great depth, and don't want to rehash that.

First step, can we agree a plan to keep back-compatibility?

Mile23’s picture

Tons of people are fluent with drush and find composer to be an untenable struggle.

Did you see Jeff Geerling's Composer pain-points BoF article? https://www.jeffgeerling.com/blog/2017/composer-and-drupal-are-still-str...

The reason this is all weird is because a decision was made early on to keep the tarball like it is. That is, the reason Composer is more than slightly broken in Drupal 8 is so that folks wouldn't have to adapt to a new file system or build process.

This issue is part of where that conversation took place.

First step, can we agree a plan to keep back-compatibility?

We can agree that's a worthwhile goal.

I'm not going to write your BC layer, however, unless you specify it.

It sounds like your needs relate to drush, and how they're beginning to require that you use Composer to install drush per-project. I'm not sure what core can do about that. Some have pointed to a core-native CLI as a replacement, which would be useful, but would also be out of scope for a discussion of Composer integrations. #2242947: Integrate Symfony Console component to natively support command line operations

ressa’s picture

For everyone following this issue, amateescu is experimenting with an approach, where a service on drupal.org handles all the Composer dependency resolution, and it looks very promising: #2910136: Experiment: package PHP libraries in a single Phar file

alison’s picture

(ack, sorry, lost track of email notifications)

Did you see Jeff Geerling's Composer pain-points BoF article? https://www.jeffgeerling.com/blog/2017/composer-and-drupal-are-still-str...

Yes, _nods_, indeed.

Re: BC plan -- makes sense we'd need to map something out first, yeah. Hrmph. I think at this point I'm mostly following #2908394: Use Composer to build sites without forcing users to learn Composer, plus some recent interest in seeing where #2910136: Experiment: package PHP libraries in a single Phar file goes, I think it's clear that the scope of this issue is meant to be about improving the Drupal + composer integration, right?

EDIT: And now, #2912406: [META] Replace update_manager with a more powerful solution

mbaynton’s picture

After posting this comment I'll try to update the issue summary as well. The majority of the comments here are over 2 years old, and in the interim, tooling, officially sanctioned infrastructure, and documentation around "Use composer to build sites" have come into their own.

Remaining problems

There are of course problems and sticking points that remain, but I think they revolve around these concerns:

  1. Some semi-theoretical problems where version conflicts could arise if a site builder wants module A and B, both of which need dependency X, but quite different versions of X. As more modules are built with large dependency trees in tow, this will become less theoretical.
  2. The simple act of accurately reporting that a site isn't secure becomes very challenging, if not impossible, because module maintainers can't realistically be expected to issue security releases in response to all security disclosures in their 3rd-party dependencies, and there is no automated means to ascertain that a given version of a package has a known vulnerability.
  3. What happens to the user experience if Composer is a problem for you. We have #2908394: Use Composer to build sites without forcing users to learn Composer and #2845379: Provide optional composer integration but don't force users to understand how to use composer to dive into the weeds of the "non developer user experience", although a key takeaway germane to using Composer to build sites in general is that it's bad user experience if the Composer dependency resolution underlying a pretty module shop UI cannot find an installable set of dependencies due to version conflicts in the modules the user chooses.
  4. Dries' priority to achieve automatic update capability, especially with respect to security, adds additional challenges:
    • Automated code changes should equate to minimizing code changes on a given installed site so as to minimize the risk of existing site breakage, but if we build (and rebuild) sites with Composer, it may make sweeping changes to that site's code. If anything can be done to keep code changes that address security fixes to only the security fix, then we really should do it.
    • Version constraints in an installed module's composer.json might hold a site back from receiving a security update to a 3rd party dependency, so turning on automatic updates and forgetting about it could create a false sense of security.
    • The Composer Dependency Resolver is very resource intensive, so it can't be expected to run on the sites themselves in order to affect an update.
  5. It might be nice for uninstalling a module to purge any newly unneeded 3rd-party libraries from the sitewide autoloader; this is one unresolved item brought up in 2 year old comments.

Proposal

Okay, so here's something I haven't seen anybody propose yet that would help with most everything summarized above.

First, imagine there was a composer repository that, by definition, contained only packages that were compatible with each other. This would mean, for one, that within that repository, you'd never see two major versions of a semver-abiding project. And suppose that this composer repository contained not just packagist things or packages.drupal.org modules, but both. This repository would not contain the entirety of packagist or all 30k drupal modules either, mind you, only the heavy-hitters: packages and modules that the usage data suggest would be sufficient to build lots and lots of sites. A particular version of Drupal core would be among them. The repository would have a predefined and well known end-of-life, at which point everything in it would stop receiving updates.

So that's hopefully not seen as too radical. Honestly, I might have dismissed it as radical and unattainable myself, but we have proven examples from outside the PHP world of that kind of repository being sustainable and solving the problems we'll otherwise encounter. Isn't this precisely what Debian, and CentOS, and Fedora are providing you when you choose to use one of those instead of building all the packages on your server straight from source? These Linux distributions lock you into a feature set for the duration of that version of the distribution in exchange for confidence that everything in the distribution is interoperable and will receive security updates but otherwise what your server did yesterday is the same thing it will do today.

I'm suggesting we figure out how to do for the PHP community what Debian, and CentOS, and others have done for the Linux community. It's what works.

Benefits of proposal

  • Problems with dependency version mismatches are no longer possible, because only versions of packages and modules that are compatible with each other are available for installation.
  • A service the repository's maintainers would provide to its users would be to tag new releases contained within it that are security-related in an agreed-to, standardized way. So it would be possible to automatically check if a site is secure.
  • Once the repository is born, new features and code changes in general would be kept to a minimum, following a similar policy as you see Debian's maintainers follow for what should be backported from upstream. Hitch your site's wagon to this repository, and you are choosing to have a trouble-free experience but not new features until you are ready to take the trouble (read: pay an agency) to switch to another repository with newer packages. If that feature lock-in is too restrictive for you, packagist and packages.drupal.org are still there for appropriate audiences.
    I strongly believe this layer (the new repository) is a must before we propose to automatically update sites in the age of Composer.
  • Within the repository, the version constraints found in its composer.json files would be modified from upstream to specifically reference a version of the dependency from this repository, but would always allow newer versions of the dependency. (So if the repository was called "foo", something along the lines of "~1.0-foo". Yeah, you might need to add some new constraint operators to Composer instead of ~ that understand that notation.) In this way, unnecessarily restrictive version constraints buried in a module's composer.json wouldn't exist to hold back sites from being secured.
  • The dependency resolution algorithm required within such a repository could be radically simplified. The latest version of any package or module in the repository is by definition compatible with the latest version of any package or module it depends on from the same repository. Really, it's not "dependency resolution" anymore, but a much simpler "dependency satisfaction". You just build a list of all packages required one or more places and make sure you've downloaded them all. As a result, it would not be too memory-intensive to perform this step locally on the sites that are being updated, which should help both with the technical challenges of executing automatic updates and with adding/removing things from the autoloader.
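As a rough illustration of that last point, "dependency satisfaction" within such a curated repository could be as simple as the following sketch. The function and index names are hypothetical, and the curated repository's invariant (the latest release of every package is compatible with the latest release of everything it depends on) is assumed:

  // Walk the require lists and collect every needed package once; no solver.
  function collectPackages(array $rootRequires, array $repositoryIndex) {
    $needed = [];
    $queue = array_keys($rootRequires);
    while ($queue) {
      $name = array_shift($queue);
      if ($name === 'php' || isset($needed[$name])) {
        continue;
      }
      $needed[$name] = TRUE;
      // $repositoryIndex[$name]['require'] is assumed to list that package's
      // own dependencies as recorded in the curated repository.
      $deps = isset($repositoryIndex[$name]['require']) ? $repositoryIndex[$name]['require'] : [];
      foreach (array_keys($deps) as $dep) {
        $queue[] = $dep;
      }
    }
    return array_keys($needed);
  }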

But how would such a repository be maintained?

Well, it would obviously be a certain amount of work. But again, Debian and others do it, and have done it for decades, on a volunteer force of package maintainers, with consistently high quality results. In part that's because often a Debian package maintainer has their own interest/need for the package they maintain to exist within this stable ecosystem; the analogue in Drupal might be that site owners or agencies with ongoing support contracts would pick up one or two of the packages they personally needed. Also, if such a repository started to gain traction, its utility would organically extend beyond Drupal, so contributors could be drawn from a wider community than just Drupal as well.

Over the weekend I've written some code that is analyzing every project on drupal.org, and for those projects that use Composer, is recording the project's usage statistics along with the complete set of (direct and indirect) dependencies needed for that module to work. That's taking a little while to run as you can imagine, but I'll be able to make some data-driven inferences about how many packages / modules are required to handle n% of sites in coming days.

I spoke with a Debian package maintainer a bit about how they do it as well, and he mentioned that they've built up subject-area teams that are collectively responsible for a set of related packages. This helps to spread responsibility and be more assured that there are people behind each package. Package maintainers aren't expected to be Johnny-on-the-spot when there's a security issue with their package either; rather those are almost all handled separately by the Debian security team. There's also ample automation around creating and testing packages.

and in conclusion...

For non-developers, the move to Composer is a sorry state of affairs, for reasons more fundamental than just whether they've seen a command line before. It's well established by now that the solution is not to take Composer away from Drupal, so the solution has to embrace it. A Composer repository whose audience is non-developers and anyone who wants a create-it-and-forget-it website would take away the hurt. We should go build that.

Mile23’s picture

The way to solve the concerns of whether your individual Drupal site has irreconcilable Composer-based dependencies is to test it.

That is, have a process that assembles the dependencies and tells you whether any are irreconcilable. Then you address the problems with individual dependencies by 1) patching whichever contrib project is in conflict so you can perform the build, 2) giving the patch to the project maintainer, and 3) updating when the patch is committed.

Adding the ability to patch through Composer is making progress: #2912365: Allow the drupal/drupal Composer project to apply patches (In fact, it's something you can already do with drupal-project.)
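
For reference, here is a minimal sketch of what applying such a patch looks like today, assuming the cweagans/composer-patches plugin (the mechanism drupal-project uses for this); the module name, description, and patch URL below are placeholders:

    {
        "require": {
            "cweagans/composer-patches": "^1.6"
        },
        "extra": {
            "patches": {
                "drupal/some_module": {
                    "Placeholder description: loosen the conflicting dependency constraint": "https://www.drupal.org/files/issues/example.patch"
                }
            }
        }
    }

With that in place, composer install applies the patch while assembling the codebase, so the loosened constraint travels with the site's root composer.json instead of living only in someone's local checkout.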

We're also trying to get moving on automating this dependency bounds testing process for Drupal core, and eventually for contrib: #2874198: Create and run dependency regression tests for core

My main problem with the proposal in #172 is statements like this:

So it would be possible to automatically check if a site is secure.

That is not a promise anyone can make, much less the DA.

But there's lots of due diligence already. For instance: https://security.sensiolabs.org/check

dsnopek’s picture

#172: Thanks for proposing this idea! I think the idea is sound, at least as far as the "theory behind the idea" goes. I think it could solve the problem of non-developers using Composer to update their sites, and allow automatic updates, with a lot fewer surprises.

The challenge here is going to be rallying people to actually do that work.

This actually reminds me of the sort of thing we've done with Panopoly, where we have .make files with modules that are known to work well together (including patches) and some known good starting configurations, so you can, for example, have a stable Media configuration which is regularly updated and shared, rather than individual users having to figure out how to manage the sea of versions and patches and configuration individually.

However, what I've learned from that experience is that it's a ton of work. :-) And what you're describing is on a much bigger scale.

So, I'm not saying not to do it. It's totally doable if a team can be assembled to do that work, and to stay on it for the long haul. But, I suspect it's going to be a huge amount of work.

mbaynton’s picture

In assembling such a team, we shouldn't be afraid to look beyond Drupal. For example, WP plugin developers struggle with the same compatibility issues, so they might get value from participating.

mbaynton’s picture

@dsnopek yeah, so the ton-of-work part.

I completely agree, the hard part is rallying people. The concept is simple, and the coding is not going to net anyone a CS Ph.D.; it's more a matter of just doing it.

The thing is, I'm sure more overall human hours are going into less-than-automatic updating, verifying, and fixing of sites one by one now than would be necessary if they could use a more stable, feature-frozen repository. It's a matter of finding enough people who have that job and redirecting the effort being expended, to everyone's net benefit. And as a bonus, it'd make Drupal accessible to non-developers again.

People-rallying is not my strength I'm afraid, so definitely open to help / networking / advice in that area.

As an experiment, I'm going to create a proof-of-concept repository that at first just satisfies an installation of Drupal 8 core, and begin maintaining it in a conservative, feature-frozen manner, to see what the workload really looks like. A useful-sized repository obviously isn't sustainable or viable for a single person to attempt though, so it's just a beginning. Hopefully it will help others to see the value and offer to contribute.
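
For illustration, the only change a site would need in order to consume such a repository is a repositories entry in its root composer.json, optionally disabling Packagist so the curated repository becomes the sole source of packages. This is a sketch with a placeholder URL, not a real endpoint:

    {
        "repositories": [
            {
                "type": "composer",
                "url": "https://packages.example.org/curated"
            },
            {
                "packagist.org": false
            }
        ],
        "require": {
            "drupal/core": "^8.4"
        }
    }

Everything else (composer require, composer update) would work as usual, but resolution would be limited to whatever the curated repository publishes.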

It will also redirect my time for such things away from my work on a new web-based updater, but after mulling for 5ish days I think this is more important and fundamental.

Mile23’s picture

I'm still not sure what's accomplished by curating Packagist.

mbaynton’s picture

@Mile23, is that because you don't think there are problems in the first place, or because you don't understand how a repository like I describe would solve the problems?

Mile23’s picture

Well, it seems like the problem you're trying to solve is that various packages have different requirements, so you want to limit the available requirements.

mbaynton’s picture

Yeah, basically. What I'm really trying to limit is not the set of available dependencies; that's a negative side effect accepted for reasons of practicality. I'm limiting complexity and the creation of distinct, heterogeneous overall codebases from site to site.

A repository where everything's compatible within it limits the permutations of dependency versions an unsuspecting non-developer site builder might end up with. This makes it more achievable to test ahead of time how exact versions of dependencies interact with each other. When tests don't catch a problem, it also increases the set of people likely to be affected by exactly that problem, and so makes it more likely that somebody out there will fix it for the non-developer, hopefully as a minor release of that version upstream, or in the repository if upstream isn't supporting that version anymore.

All this works exactly the same with Linux and its distributions. There's a reason people use a curated Linux distribution, and not git clone; ./configure; make for everything they want to run on their computers/servers, if they're not a Linux kernel hacker. Linux distros take all the complex interdependencies among C libraries, find one of the many possible ways to fulfill them all, and give it to you as a finished product that just works on day 1, will remain stable, and will get one-button security updates covering the entire distribution for a predefined length of time.

Linux would not be practical for a majority of its users if these distributions didn't exist. That's a fact. I sure couldn't / wouldn't bother to get it working when simpler competitors exist. With a complete Drupal installation resembling more and more a complete Linux installation complexity-wise, the same thing applies to Drupal.

I think it's generally agreed that Drupal 8 is already making many users "feel left behind" (Dries' words) on account of the frequent pace of feature-adding updates, the difficulty of setting up a site in the first place, and the difficulty of keeping it secure. Those would be the same pain points in attempting to maintain a Linux system by using the latest releases of everything all the time, direct from the original authors. It's the same problem for Drupal and for Linux, and the same solution works for both.

The benefits are for non-developers, and for developers who just want to build a quick site and forget about ongoing maintenance. If you're thinking like a developer and you have the time, knowledge, and infrastructure to run composer update on your codebase in a test environment, verify it's all still good, and fix anything that broke, then you have no problem.

Mile23’s picture

A repository where everything's compatible within it...

Uh, stop right there. :-)

So let's look at this other issue: #2842431: [policy] Remove PHP 5.5, 5.6 support in Drupal 8.7

You'll see that upstream changes to Composer-based dependencies mean that we're either going to have to drop certain PHP versions or be locked in to specific dependency constraints forever, thus forcing us away from bugfixes, security benefits, and features.

You'll also see that a whole bunch of people are doing a whole lot of work already on the problem of dependency compatibility.

How, exactly, would limiting the options for available packages based on the needs of contrib aid in this effort?

mbaynton’s picture

@Mile23, that's substantially a conversation among people heavily involved in advancing Drupal core. My main takeaway from #2842431: [policy] Remove PHP 5.5, 5.6 support in Drupal 8.7 is that people who are busy advancing Drupal don't have time or inclination to start maintaining a fork of core's dependencies in the interests of maintaining more inclusive system requirements for the overall project. That's not surprising; it's very rare to see upstream maintainers do that in the wider FOSS world as well.

The impetus for open-source software is usually some developer's personal need or desire to create it. Core maintainers have no use for security-supported but otherwise old versions of things themselves, so they're not rushing to make that happen. But I believe large segments of Drupal's user base do have a use for exactly that. It allows existing sites to remain relatively unchanged, which is useful for smaller operations that can't run comprehensive tests; it makes keeping your site up simpler, because you don't have to update it all the time; and it holds system requirements lower too. (PHP 7 is not a big deal for devs, but there's a reason WordPress still supports 5.2 despite loud pleas for at least namespaces from WP's contributors.) It's from the community of people who run sites and would prefer existing sites to be more stable that I'd expect to find folks willing to help support older versions of packages, not from the people in #2842431: [policy] Remove PHP 5.5, 5.6 support in Drupal 8.7.

I think the whole reason that issue does not read "Issue summary: Let's require php 7 now. Comment 1: Great idea! composer.json requirements updated! [Closed, fixed]" is that there is at least recognition from the Drupal project that it inconveniences people, though.

geek-merlin’s picture

A repository where everything's compatible within it...

There is no such thing. Compatibility is not transitive.
So if two contrib modules both play well with core but not with each other (there are lots of those), which do you take?

mbaynton’s picture

@axel.rutz the more popular one. The goal for iglue is a popular subset of modules that are sufficient to meet the needs of the majority.

cilefen’s picture

crag2012’s picture

I wish to thank the developers and hackers creating, maintaining, and developing Drupal; I have been on a slow journey over many years, jumping on around Drupal 6. I just potter about creating personal-use sites. I love openSUSE and YaST: it allows adding items and checking the dependencies of the additions, including interactively resolving problems with those already installed.
Up to Drupal 7 we just added modules to the installation and checked that they worked for that instance and that user. Users just want a simple system that allows installation of modules: plug and play.
Firefox is gradually losing users because of the way the framework has changed; the add-ons fail to work and require major rewrites, and volunteer developers do not want to put in unpaid time to keep rewriting the code every few months.
We want a system that does not require a Master's in coding/software development, just plug and play. No compiling, no hassle.
There needs to be a piece of software, with the modules responsive to it, similar to YaST, which does the job of checking, enforcing security and compatibility, and then incorporating the module into the site.
Sorry, my experience only goes back to the analogue days of finding the right-sized ball!

nod_’s picture

Is this still relevant, now that the auto update initiative is progressing nicely, the project browser is getting started, and the DA has been doing all that Composer work?

effulgentsia’s picture

Status: Active » Closed (duplicate)

#2940733: Site Builder Tool/Project Browser initiative was created after all but the last 3 comments on this issue had been written, and as far as I can tell the goals of that issue now completely encompass this one, so I'm closing this as a duplicate of that one. Please re-open this if there's something I missed here that's distinct from that issue. Thanks!

effulgentsia’s picture