This is a dependency of:
#395474: Plugin Manager in Core: Part 2 (integration with update status) and
#395478: Plugin Manager in Core: Part 3 (integration with installation system)

That other CMS just released a damn sexy package manager, so we've got to catch up. The efforts listed above should provide a nice UI and the functionality to do it. A missing piece is package signing, something WP.com doesn't do AFAIK.

There have been a lot of discussions about this, and I'm sorry if this should be on another issue, but here is my idea about how and why we should use OpenSSL:

1. It's available on the majority of sites. Acquia Network reports that over 85% of sites phoning home have it. I imagine 10% of the ones who don't are sandboxes. Those who don't have it will get a big fat warning, and in 99.9% of cases there is no issue. If they don't have a way to verify signed packages, that's a risk they have to take. If someone feels really called to make it degrade into some text boxes to shove in MD5 sigs, be my guest (it won't be me).

2. It doesn't require the user to know anything, or to go around hunting for md5 signatures, or to do some funny JS tricks to get them from d.o. I think this is really important. The whole point of this venture AFAICT is to give "normal" users a very simple, usable way to keep up to date and install new extensions to Drupal. (A Drupal core updater is probably a D8 task given that Drupal is not set up to run from outside of itself very well.)

3. It's damn secure. The only weak spot is the private key on the server, but we plan to lock dww in the server cabinet and have the key installed in his brain Johnny Mnemonic style. Should work, plus he might just start looking like Keanu Reeves.

4. It is possible to have a central key server, and mirror the files off to other places. This will be important for scalability.

5. It really shouldn't be too hard. Included is a tarball which gives you all the pieces you need to try it out.

I'm guessing that we would include the signatures in update-status reports, and for new installs, we would need a way to pull them via a webservice. That can be a follow up issue.

For the time being, here are the goals:

1. Create a key pair. Store the private key on the server. Store the public key in CVS as part of Drupal core.
2. Modify the packaging scripts to create signatures of every release and store them like the MD5 is stored now.

On the client side, I'll work on implementing the key verification and warning if SSL isn't installed in the issues mentioned above.
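To make the two goals above concrete, here is a minimal sketch of the whole flow using the `openssl` CLI. All file names and the example release tarball are made up for illustration; the real packaging scripts would obviously use the actual release artifacts.

```shell
# Server side (one-time): create the key pair. The private key stays on d.o;
# the public key ships in Drupal core.
openssl genrsa -out drupal_private.pem 2048
openssl rsa -in drupal_private.pem -pubout -out drupal_public.pem

# Packaging script: sign each release tarball, stored next to the MD5.
echo "fake release contents" > views-6.x-2.1.tar.gz
openssl dgst -sha1 -sign drupal_private.pem \
  -out views-6.x-2.1.tar.gz.sig views-6.x-2.1.tar.gz

# Client side: verify the downloaded tarball against the shipped public key.
# Prints "Verified OK" on success; a tampered tarball fails verification.
openssl dgst -sha1 -verify drupal_public.pem \
  -signature views-6.x-2.1.tar.gz.sig views-6.x-2.1.tar.gz
```

The key point is that only the public half ever leaves the server, so a compromised mirror can't forge a signature that verifies.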

How 'bout it! Are we ready to kick some ass and make Drupal as easy to update as Ubuntu? (joking)

Best,
Jacob

Attachment: ssl_fun.tar_.gz (9.83 KB, JacobSingh)

Comments

Dries’s picture

This sounds reasonable to me and the code is relatively simple.

The rest of the world seems to be using md5(), which would also be fine by me. I might even be a tad happier with md5 because I know everyone would have support for it. It also seems easier when there are alternative download sources -- the last thing people want to do is muck around with keys.

Personally, I wouldn't make this patch a dependency. People who don't trust our package manager can download the package from drupal.org and do things the old way. So, given the state we're in and the progress we need to make, I'd classify this as a 'nice to have', not as a 'must have'.

JacobSingh’s picture

Yeah, I personally agree with you it is a "nice to have" for now. But I think some people feel it is absolutely necessary... Maybe there is a good middle ground? Perhaps a contrib module in D7?

The problem with md5 is that it is subject to man-in-the-middle (MITM) attacks. If someone can compromise your DNS and feed you a package you don't want, they can just as easily compromise the thing that serves you the MD5 hash.

Anonymous’s picture

Another concept worth review is to retrieve the XML via https. Granted, this would require adding the root certificate for our CA to the codebase, but that shouldn't be a big deal. If we could verify that the XML we received came in via a secure connection, we could trust the listed md5sum as being authentic, and thus trust any package that matched it. If we couldn't use https, then we could give the warning that "the server cannot verify that the site you are contacting is actually d.o. It probably is, but we can't guarantee it. Proceed at your own peril." This should add an acceptable layer of security while only adding a small handful of lines to the file. (Does anyone know the likelihood of an installation having curl?)
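Assuming the md5sum really did arrive over a verified https connection, the client-side check itself is trivial. A sketch (the package file, its contents, and the listed hash are all stand-ins; the hash below does match the stand-in file):

```shell
# Stand-in for a package downloaded from some mirror.
printf 'hello\n' > cck-6.x-2.1.tar.gz

# md5sum as listed in the https-verified XML (hypothetical value that
# matches the stand-in contents above).
listed="b1946ac92492d2347c6235b4d2611184"

actual=$(md5sum cck-6.x-2.1.tar.gz | cut -d' ' -f1)
if [ "$actual" = "$listed" ]; then
  echo "md5 OK"
else
  echo "md5 MISMATCH: refuse to install"
fi
```

The security of this scheme rests entirely on the https fetch of the listing; the md5 comparison only extends that trust to the package bytes.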

I would actually argue that this should be implemented in any case, since it (the XML) specifies where projects are downloaded from. Just because a package originally came from d.o doesn't necessarily mean that it is the package that I mean to install. Imagine 'upgrading' from a new version of CCK to an older version (hosted somewhere else) that has known vulnerabilities. The older one would still be signed, so it would pass this key signing test. Another scenario would be if the XML listed the download location for CCK as 'drupal.org/whatevergoeshere/troll-x.y.z.gz.' The installer would remove CCK and install troll in its place without realizing it. Fun denial-of-service scenario. Yeah. No reason to secure the downloads if we don't secure the listing of where to download from.

As far as the original post:
#1: The percentage definitely sounds nice. Could we consider combining these ideas?
#3: Does he have to talk like Keanu also?

The fact that no other CMS (off the top of my head) currently uses a secure method of installation is all the more reason for us to do so. Worst case, it's a selling point for companies wanting to adopt Drupal; best case, it actually prevents a few installations from becoming compromised.

(#5: It's too early/late for me to try it out. I'll try to remember to do that in the (later) morning, after sleep. ;)

JacobSingh’s picture

Very good point there.

Joshua and I discussed this, and here is what I suggested:

The other trick (which is kinda hokey) is to just extract the package and look at the .info file; if it is older than the current version, or not the same module, then we chuck it.

Not totally elegant, but perhaps simple to implement.
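Roughly, that check might look like this (module name, file layout, and version strings are all made up for the example; a real implementation would use Drupal's own version parsing rather than `sort -V`):

```shell
# Fake an extracted package whose .info file reports an older version.
mkdir -p extracted/cck
printf 'name = Content\nversion = "6.x-2.0"\n' > extracted/cck/cck.info

current="6.x-2.1"
candidate=$(sed -n 's/^version *= *"\(.*\)"/\1/p' extracted/cck/cck.info)

# sort -V orders version strings; if the currently installed version still
# sorts last, the candidate is not an upgrade and we chuck it.
newest=$(printf '%s\n%s\n' "$current" "$candidate" | sort -V | tail -n1)
if [ "$newest" = "$current" ]; then
  echo "chuck it: $candidate is not newer than $current"
fi
```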

Paul Natsuo Kishimoto’s picture

If we could verify that the XML that we received came in via a secure connection, we could trust the listed md5sum as being authentic. Thus we could trust the packages if they matched the md5sum. If we couldn't use https, then we could give the warning that "the server cannot verify that the site you are contacting is actually d.o. it probably is, but we can't guarantee it. proceed at your own peril."

As I understand it, Debian's apt system doesn't use SSL to download the package list. Instead, there is a Release file that contains the checksum of the package list, and is signed using a private key. The client has the matching public key. The client:

  1. Downloads Release and the package list.
  2. Verifies the signature of Release using the public key.
  3. Reads the checksum of the package list from Release.
  4. Verifies the checksum of the package list.
  5. Reads further checksums and package information from the package list.

When new packages are downloaded, their checksums are verified using the data from #5 (more information here).
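The apt-style chain described above can be sketched with stock tools. Everything here is illustrative (key names, the package list contents); the point is the indirection: only the small Release file is signed, and it vouches for everything else by checksum.

```shell
# Master repository: the private key never leaves this machine.
openssl genrsa -out master_private.pem 2048
openssl rsa -in master_private.pem -pubout -out master_public.pem

# Build the package list, then a Release file holding its checksum,
# then sign Release.
printf 'cck 6.x-2.1 <checksum-of-tarball>\n' > Packages
sha256sum Packages > Release
openssl dgst -sha256 -sign master_private.pem -out Release.sig Release

# Any untrusted mirror can now serve Packages, Release, and Release.sig.

# Client (ships only master_public.pem):
# Step 2: verify the signature on Release ("Verified OK" on success).
openssl dgst -sha256 -verify master_public.pem \
  -signature Release.sig Release
# Steps 3-4: confirm Packages matches the checksum recorded in Release.
sha256sum -c Release
```

If a mirror tampers with Packages, `sha256sum -c` fails; if it tampers with Release, the signature check fails. Either way the client detects it without any secure connection.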

One advantage of this approach is that the private key can be confined to the master repository—i.e. it doesn't need to be distributed, ever. The Release file, package list and packages can be downloaded and republished by an arbitrary number of mirrors. The security of the mirrors does not need to be controlled, because if they are compromised then the files will be detected as inconsistent by the client. Similarly the connections from the master repository to the mirrors, or from the mirrors to the client, do not need to be secured.

Eliminating any need for secure connections will ultimately keep resource usage lower.

Another point is that packages should only be installed from the repository in which they are listed. So, for example, in the far-off future when a user is installing packages from both (A) drupal.org and (B) drupalmoduleserver.thirdparty.com, it should not be possible (or necessary) for a package list from (B) to point to modules on (A)...or vice versa.

Also, sha1 is in PHP now...why continue with MD5?

dww’s picture

Project: » Drupal.org infrastructure
Component: Code » Packaging
Issue tags: +Update manager

I just happened to find this issue. This needs to be discussed as a d.o infra issue, not as an issue in the package management queue, and it's also relevant for the Update manager in D7 core. I'm exhausted now, so no time to think about this or comment on the proposals here, but moving to a more appropriate place to have the discussion.

pwolanin’s picture

I discussed this previously with Joshua and in IRC, so I'm interested in how we can improve the current situation.

eliza411’s picture

Status: Active » Closed (fixed)

Closing old issues. Please re-open if needed.

greggles’s picture

Status: Closed (fixed) » Postponed

Definitely still relevant. Can be postponed until someone is working on it if that feels better.

mgifford’s picture

Issue summary: View changes
Issue tags: +Security

Tagging for security. Hopefully we can get some support to make this happen.

fizk’s picture

I created Trusted Modules as a front-end for a package signing system. I hope we can make some progress on this issue.

pwolanin’s picture

dww’s picture

I just found these. The article was a useful read. Haven't looked at the github repo, but it might be promising.

https://paragonie.com/blog/2016/10/guide-automatic-security-updates-for-...
https://github.com/theupdateframework

irinaz’s picture

@greggles, does this problem still exist? If yes, could you give more details? If no, could we close it?

greggles’s picture

Status: Postponed » Closed (won't fix)

I think the problem space has changed enough that we should close this issue.

I believe new work in this area is coming out of the autoupdates initiative.