PHP Security

PHP Package Signing: My Current Thoughts


We figured out how to write good code. We figured out how to write good code in a reusable way…for the most part. We figured out how to distribute and mix all that good reusable code in a sensible fashion. Can we now figure out how to do it all securely?

Package signing is a simple enough idea, and I’ve been spending time trying to fit it, Composer and Packagist together as a model in my head. The concept is to have parties create a cryptographic signature of the code to be distributed at a specific point in time using a private key. If anyone were to change that code or its metadata (think composer.json) with malevolent intent, the end user would then notice that the cryptographic signature cannot be verified using the package author’s known public key. It’s a familiar topic from all those cryptography books you’ve doubtlessly read ;).

Alright, it’s actually a horrendously complicated topic that boggles the minds of many a programmer. We’re a practical bunch, and we just want the damn code. NOW!

Practical considerations and security are locked in a continuous battle for primacy. Look at SSL/TLS – it is a security nightmare but we keep it around because, until someone comes up with a decent replacement, the alternative is no encrypted HTTPS with a verifiable host for anyone. We continue to support old versions of SSL/TLS out of practical concerns despite knowing their weaknesses. They are old versions for a reason!

Those same concerns have been at war in my own head since last week, when I made the mistake of contemplating package signing. Eventually, my practical side won out, and my security persona has been sulking in a corner ever since, refusing to talk to me.

The problem with package signing from my perspective is tied up in a phrase most of you would know: The needs of the many outweigh the needs of the few. Thank you, Spock.

PKI vs GPG (Some Context!)

I won’t go into too much detail here…

Right off the bat, we have two contenders for signing packages: Public-Key Infrastructure (PKI) backed by a Certificate Authority (CA), and Pretty Good Privacy (PGP), also commonly referred to by its GNU implementation, GNU Privacy Guard (GPG). You’d be most familiar with PKI in the form of the SSL certificates used online. Both have the notion of private keys and public keys. Data encrypted by one key can only be decrypted by the other key. If you keep one private, then holders of the public key can always verify that data sent by you was really sent by you. If you lose the private key, you’ll need to revoke it and get a new one.

Assuming they trust it’s you to start with!

Trust is the core difference between PKI and GPG. How do you know, with certainty, that any given public key is firmly associated with the person you believe it belongs to? Maybe it’s a hacker posing as that person? Maybe it’s the friendly local NSA office masquerading as Google? Establishing trust takes diverging paths for PKI and GPG. PKI keys (in the form of certificates) are either self-signed or signed by a trusted Certificate Authority. Generally, we put zero faith in self-signed certificates because anyone can claim to be anyone else using them. We instead trust a select number of CAs to sign certs because they’ll hopefully do things like ask for passports, addresses, and other person- or company-specific information to verify an entity’s real identity before doing so. GPG avoids centralised authorities like the plague and instead uses a “web of trust” model where everyone can sign everyone else’s public key, i.e. the more of these endorsements a GPG public key gets, the more likely it is that the key represents the expected identity (based on the number of endorsers you already trust). Webs of trust require time, care, and effort, but they have been extremely successful and certainly do work.

Excellent, your brain is still not smeared over the monitor. That’s a good sign. Now, what the heck has this got to do with PHP and package signing?

People Are Lazy And Cheap

These are the two things you can count on with rational people, though perhaps phrased in more charitable terms. People don’t want to do any more work than they need to, and they generally don’t want to spend any more money than they need to. Add one more trait to those – people generally don’t think about security when downloading code – and you have something of a problem.

PKI dies an immediate death in this worldview, because obtaining a CA-issued code signing certificate costs money. That guy who wants to put 100 lines of code onto Github? That’ll be $100, please. Package signing using a CA will never work for PHP because it imposes a monetary cost on open source contributions. Many of us might not blink at spending $100 to indulge our willingness to write free code, but we’d probably all prefer not to.

GPG is utterly free. Surely it’s a winner then? That guy who wants to put 100 lines of code onto Github? He probably has to now generate a new GPG keypair with a resulting public key that nobody has signed and which nobody will trust. He’s way past caring now. And so are the potentially tens of thousands of people who can’t verify his code. They want the damn code. NOW! Not months down the line when he’s begged, pleaded and bled his willpower dry trying to get sufficient signatures from others in the community to get a widely trusted public key. I’m exaggerating a wee bit, but the barriers to entry for GPG were intended to prevent weakness. There are shortcuts, but shortcuts undermine the purpose of a web of trust. Meeting in person is one of the most often quoted means of getting your GPG public key signed properly. You can find me somewhere in the Wicklow mountains of Ireland. I’ll wait ;).

From a security perspective, both of these options can and do work. Practically? They sort of stink if you want global adoption. One costs money, and the other requires time and effort for both package authors and users. In the real world, these problems are not uncommon. Throw GPG signatures at everything, and a tiny minority will actually bother checking them. Require CA code signing certs, and you will be ignored by programmers. These options, particularly given the distributed nature of PHP packages (i.e. github repos), will only ever serve a minority of users. Would you oblige us again, Spock?

The needs of the many outweigh the needs of the few.

Thank you, gravelly voiced Spock. Sorry about Abrams; we didn’t think he’d actually blow up Vulcan, replace phasers with blasters, and substitute you with an emotional basket case because he thought he was making a Star Wars movie.

I just killed PKI and GPG as the basis for any proposal I’d make for package signing in PHP. Making <1% of the community extremely safe while leaving >99% of it extremely vulnerable leaves me with a bad taste in my mouth, because it conflicts with my desire to spread security as widely as possible to protect as many people as possible. I could blame the ignorance of users for the need to give up some notional perfect security, but they just want their code, and that is the reality we live in. One might as well stand on a beach battling the tides as deny that humans are, well, only human. So, it’s time to come up with something else that can be easily implemented in PHP.

The Double Signature Alternative

The double signature approach to package signing relies on having a minimum of two parties verify the integrity of a package independently of each other. Luckily, we already have two parties: the package author and the Packagist repository. The basis of its effectiveness is probably obvious. Each independent signature requires a separate private key which is ultimately held offline. In order for an attacker to compromise a package, they would also need to compromise both private keys. This assumption rests on the notion that each party’s public keys are known by the user in advance and maintained in some permanent keystore, e.g. included in composer.phar or accepted upon first download of a public package and stored in a simple file that you can copy between systems (in effect quite like GPG’s keyring).

Let’s say that I tag a release for LibraryX 1.0.0 tomorrow. As the package author, I would sign it using a private key that I generated at home. I’d then advertise the public key widely for users to download and cache (our permanent keyring). Packagist, when building the metadata for the new tagged release, would also generate its own signature for the package. Our download client, which we’ll assume is Composer, will check both signatures and only accept a package for use when both signatures can be verified.
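As a rough sketch of that flow using the openssl CLI (all key and file names here are invented for illustration; a real implementation would also handle key distribution and caching):

```shell
set -e
workdir=$(mktemp -d) && cd "$workdir"

# Two independent keypairs: the author's and Packagist's. In practice the
# private keys stay offline; only the public keys ship to users' keystores.
openssl genpkey -algorithm RSA -out author.key 2>/dev/null
openssl rsa -in author.key -pubout -out author.pub 2>/dev/null
openssl genpkey -algorithm RSA -out packagist.key 2>/dev/null
openssl rsa -in packagist.key -pubout -out packagist.pub 2>/dev/null

# The tagged release to be distributed.
echo 'LibraryX 1.0.0 contents' > libraryx-1.0.0.zip

# Each party signs the archive independently.
openssl dgst -sha256 -sign author.key    -out libraryx.sig.author    libraryx-1.0.0.zip
openssl dgst -sha256 -sign packagist.key -out libraryx.sig.packagist libraryx-1.0.0.zip

# The client accepts the package only if BOTH signatures verify.
openssl dgst -sha256 -verify author.pub    -signature libraryx.sig.author    libraryx-1.0.0.zip
openssl dgst -sha256 -verify packagist.pub -signature libraryx.sig.packagist libraryx-1.0.0.zip
```

An attacker who steals one private key still cannot forge both signatures, which is the entire point of requiring two independent parties.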

While I refer to this as a “double signature” approach, there remains scope for adding a third independent party. Nor does it necessarily impose digital signing on package authors. Package authors can simply not sign anything, and Packagist would still do its own signing. This significantly lessens security, but it trumps the current situation where Packagist signs nothing at all. Composer should then differentiate by marking packages to be installed as verified or partially verified, while offering options for users to impose a minimum acceptable level of verification.

Happily, this line of thinking is exactly what is at the core of a proposed solution for both Rubygems and Python’s pypi called theupdateframework. A little external validation doesn’t hurt and my security persona might stop sulking soon. It can also be implemented entirely using openssl.

Signing A Package

At its core, signing a package for PHP is straightforward. For every single file in the package, you calculate its checksum, e.g. a SHA256 hash. You store the list of files and their hashes in a single file called the manifest. You then sign the manifest. Upon download, the user can verify the manifest’s signature and check that the list of files and hashes it contains actually matches the files in the package received.
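A minimal sketch of that manifest process, using openssl and sha256sum (the package layout and file names are invented for illustration):

```shell
set -e
workdir=$(mktemp -d) && cd "$workdir"

# A toy package with a couple of files.
mkdir src
echo '<?php // library code' > src/Library.php
echo '{"name": "vendor/libraryx"}' > composer.json

# 1. Hash every file in the package into a manifest.
find src composer.json -type f | sort | xargs sha256sum > MANIFEST

# 2. Sign only the manifest with the author's (offline) private key.
openssl genpkey -algorithm RSA -out author.key 2>/dev/null
openssl rsa -in author.key -pubout -out author.pub 2>/dev/null
openssl dgst -sha256 -sign author.key -out MANIFEST.sig MANIFEST

# 3. User side: verify the manifest's signature, then every file's hash.
openssl dgst -sha256 -verify author.pub -signature MANIFEST.sig MANIFEST
sha256sum --check --quiet MANIFEST
```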

If only it were that simple…

Metadata Is Dangerous

This part is somewhat technical, but hopefully it makes sense!

While we might suspect that files are important, securing something like Packagist requires an obsession with package metadata. Whenever you use Composer, it doesn’t run off to Github; it downloads a file called packages.json from packagist.org. Composer quite literally relies on Packagist to tell it about available packages, their versions, and the URLs for remote downloads and VCS repositories. In the absence of a secure signature-based process, this creates a gigantic single point of failure for all Composer users.

If Packagist is ever hacked, an attacker can now respond to every single Composer request unchallenged. And Composer implicitly trusts everything that Packagist tells it. Basically, it would allow an attacker to poison the entire population of Composer users. That is simply intolerable.

Metadata is the core problem we need to solve. We need to prevent attackers from redirecting package downloads, informing us of incorrect available versions, replaying old copies of the metadata, and so on. There must also then be a way to recover from the attack. In other words, we need a metadata architecture that can survive Packagist’s downfall and allow for its restoration.

If we implement not just package signing, but more specifically metadata signing, then we immediately alleviate the problem. It’s still based on the primary goal of there being at least two private keys which are kept offline. The missing features that it also requires are fourfold:

1. Role Delegation
2. Timestamping
3. Signing Thresholds
4. Key Revocation

Role delegation is a simple mechanism where Packagist maintains private key(s) to sign all of its automatically generated metadata files. These keys are delegated to by one or more master root keys which are kept offline. If the server is ever compromised, the root key is not. This would allow the Packagist maintainers to, upon restoring control, revoke the online delegated keys and replace them.

Timestamping prevents attackers from reusing older, correctly signed metadata by imposing an expiry date and version number. Old data will expire; new data will have a limited, predetermined validity period. Versioning ensures the Composer client can check that metadata it downloads is fresher than an older copy, i.e. it can safely continue running off cached copies in the meantime.
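A toy sketch of those freshness checks, using the openssl CLI and GNU date; the metadata format and every value below are invented purely for illustration:

```shell
set -e
workdir=$(mktemp -d) && cd "$workdir"

# Hypothetical repository metadata carrying a version and an expiry date.
printf 'version: 42\nexpires: 2031-01-01T00:00:00Z\n' > metadata.txt
openssl genpkey -algorithm RSA -out repo.key 2>/dev/null
openssl rsa -in repo.key -pubout -out repo.pub 2>/dev/null
openssl dgst -sha256 -sign repo.key -out metadata.sig metadata.txt

# Client side: check the signature first...
openssl dgst -sha256 -verify repo.pub -signature metadata.sig metadata.txt

# ...then apply the freshness rules.
expires=$(sed -n 's/^expires: //p' metadata.txt)
version=$(sed -n 's/^version: //p' metadata.txt)
cached_version=41   # highest version this client previously accepted

# Reject expired metadata (stops replays of old, validly signed copies)...
[ "$(date -u +%s)" -lt "$(date -u -d "$expires" +%s)" ]
# ...and reject any rollback below the version we already trust.
[ "$version" -gt "$cached_version" ]
```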

Signing thresholds are an opt-in measure where you can require certain metadata to be co-signed. For example, Packagist could require that automated key delegations are signed by at least two private keys (increasing the difficulty for attackers since they now need both!). You could do the same at the developer level for newly tagged releases.

Key revocation is another recovery mechanism. It allows the holder(s) of root keys to revoke delegated keys. After a compromise, you’ll want to replace the online private keys. This can all be done by simply root signing a new file which details the delegated keys (I’m trying to avoid too many implementation details so hit me up in the comments if you’re confused).
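As a rough sketch of delegation and rotation with the openssl CLI (key and file names invented for illustration; a real system would sign a richer delegation document than a bare public key):

```shell
set -e
workdir=$(mktemp -d) && cd "$workdir"

# An offline root key, plus an online key used for automated signing.
openssl genpkey -algorithm RSA -out root.key 2>/dev/null
openssl rsa -in root.key -pubout -out root.pub 2>/dev/null
openssl genpkey -algorithm RSA -out online-v1.key 2>/dev/null

# The root key signs a file naming the currently delegated online key.
openssl rsa -in online-v1.key -pubout -out delegated.pub 2>/dev/null
openssl dgst -sha256 -sign root.key -out delegated.pub.sig delegated.pub

# Clients who hold root.pub can verify which online key to trust.
openssl dgst -sha256 -verify root.pub -signature delegated.pub.sig delegated.pub

# Recovery after a server compromise: generate a fresh online key and
# root-sign a NEW delegation file, implicitly revoking the old key.
openssl genpkey -algorithm RSA -out online-v2.key 2>/dev/null
openssl rsa -in online-v2.key -pubout -out delegated.pub 2>/dev/null
openssl dgst -sha256 -sign root.key -out delegated.pub.sig delegated.pub
openssl dgst -sha256 -verify root.pub -signature delegated.pub.sig delegated.pub
```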

All of this works in concert, but with one crucial additional element I need to look into. We’re basically saying that blind trust in Packagist is a bad thing, even with signing, so we can’t rely only on Packagist or we’d be vulnerable to replay/rollback attacks. We should have a means of independently cross-verifying versions back to their origin, the actual git repository for the package – for example, using “git ls-remote --tags” to compare release tags as a means of validating the correctness of Packagist metadata. I have no idea how that would work outside of git, but it’s an obvious validation method made possible by the distributed nature of PHP packages.
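Something like the following sketch is conceivable for that git-based cross-check (a local repository stands in for the real origin here, and the "claimed" versions are invented for illustration):

```shell
set -e
workdir=$(mktemp -d) && cd "$workdir"

# A stand-in for the package's canonical repository.
git init -q upstream
git -C upstream -c user.name=author -c user.email=author@example.com \
    commit -q --allow-empty -m 'initial commit'
git -C upstream tag 1.0.0
git -C upstream tag 1.0.1

# Release tags as reported by the origin repository itself...
origin_tags=$(git ls-remote --tags ./upstream | sed 's|.*refs/tags/||' | sort)

# ...versus the versions the repository metadata claims exist.
claimed_tags=$(printf '1.0.0\n1.0.1\n')

# Any mismatch would suggest tampered or stale metadata.
[ "$origin_tags" = "$claimed_tags" ]
```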

Conclusion

This article was me thinking aloud but, if you follow the logic, it demonstrates that “package signing” is an understatement. We don’t just want to secure packages; we also want to secure the metadata that describes what packages exist. It’s a far more complex problem than first glances suggest. Luckily, it’s not an overwhelming one. It is entirely possible to implement basic defences using openssl, public/private keys, and some independent validation separate from Packagist itself. Though your brain will appreciate it more when it’s automated ;).


Thoughts on Composer’s Future Security


I’ve been spending a chunk of free time recently working on a few PRs for Composer related to security so this is my usual “let’s watch Paddy think aloud in a completely unstructured manner” blog post. But seriously, with all the issues and PRs going around, this is my detailed look at solving a “simple” problem: establishing trust in Composer’s source code and its operation thereafter.

The Composer issue, as initially reported by Kevin McArthur, was fairly simple. Since no download connection made by Composer was properly secured using SSL/TLS, an attacker could, with the assistance of a Man-in-the-Middle (MITM) attack, substitute the package you wanted to download with a modified version that communicated with the attacker’s server. They could, for example, plant a line of code which sends the contents of $_POST to the attacker’s server.

The obvious solution is to implement TLS support…

To mitigate this risk, I updated Composer in a PR in the following ways:

  1. Peer verification is enabled by default. Disabling it nets you a continual warning message.
  2. It follows all recommended TLS options being introduced for PHP 5.6 (thanks to @rdlowrey).
  3. Since peer verification requires root CA certificates, Composer will attempt to locate a local system certificate bundle (thanks to @EvanDotPro).
  4. If all else fails, Composer bundles root CA certificates which it will fall back to.
  5. Users can override the default detected certificate bundle by manually setting a --cafile option for most commands.
  6. The Installer has also been updated for 1-5.

Composer should now operate with SSL/TLS protections out of the box. There may be edge cases since support for Subject Alternative Names (SANs) in SSL certs has not yet been added to PHP 5.4 or 5.5, but I’m hoping that future releases of these versions will see it added. This particular issue does not impact Packagist.

Mission accomplished?

There are other TLS related features that can be looked into for the future. Users may want to generate their own CA cert bundle file as a substitute, e.g. using Evan Coury’s Sslurp, to avoid a single point of failure or trust. With the dawning realisation that government surveillance is commonplace, and that trusted CAs may mistakenly issue, or allow to be issued, certificates for entities without those entities’ permission, public certificate pinning may also warrant future attention.

SSL/TLS should protect the TCP connections between the client and the server, but it doesn’t actually verify that the code being downloaded was published by a trusted maintainer – only that you downloaded it from a verified host. So, what if the server were compromised? What if the SSL/TLS connection were breached?

Throughout 2013/2014, TLS has been besieged by a number of problems:

  1. Weaknesses in the protocol: SSL/TLS is made up of SSL 2.0, SSL 3.0, TLS 1.0, TLS 1.1 and TLS 1.2. Newer versions tend to be stronger, and SSL is overdue to be phased out of existence. Aside from obsolescence, the protocols are constantly under the microscope. On 4 March, researchers released details of the new “Triple Handshake Attack”. You’re probably already familiar with terms like CRIME and BEAST from 2013. Another part of the protocol is how encryption is implemented. TLS may use quite a number of cipher suites (named sets of authentication, encryption and MAC algorithms) in any particular order of preference. Reordering the preference to favour stronger ciphers which have Perfect Forward Secrecy (PFS) is essential to mitigate against the loss of private keys (genuinely lost, stolen, or demanded by a court order or NSL). Without PFS, one could decrypt previously logged requests once in possession of the private key. PFS is not favoured by default in openssl or curl, but it will be as part of the PHP 5.6 overhaul of PHP streams.
  2. Weaknesses in the Certificate Authority: CAs, like any entity, are capable of mistakes. In the past, some CAs have mistakenly issued, or had stolen, trusted certificates which were used to create new certificates to impersonate entities like Google, or to sign software as if it came from, say, Adobe. This is one reason why it is essential for any software bundling CA certificates to have a process for updating them regularly. There have been lots of CA problems. For example, in 2012, Trustwave publicised a worrying practice among CAs: Trustwave had been selling hardware implanted with subordinate certificates for the specific purpose of generating keys for domains that their customers did not necessarily own, and they insinuated that other CAs had similar businesses. One can imagine what the NSA would do with such a machine.
  3. Weaknesses in implementation: Apple screwed it up big time in OS X and iOS. GnuTLS was just patched for a similar issue, and I was installing updates yesterday. PHP’s native SSL defaults are both insecure and buggy, to be fixed primarily in PHP 5.6. The NSA was reading unsecured intra-datacentre traffic from Google, Yahoo, Microsoft and probably most of the planet beyond those three. Companies like to monitor what employees send over HTTPS (if you work for one, check your company browser to see what certificates are installed). There also remains a culture in programming (it’s not solely PHP) of disabling peer verification rather than dealing with error conditions. Interesting note: Apple has also just managed to disable the mitigation for BEAST in Safari 7 for all platforms other than OS X 10.9 and iOS 7 – leaving Windows and earlier OS X versions potentially vulnerable.

Relying on TLS blindly has another name: being naïve. You could write a book on TLS problems.

So even with TLS implemented and working properly, users will still have trust issues. The number of times I’ve seen complaints about piping code from a download straight into the PHP interpreter (as is suggested when installing Composer) makes that abundantly clear. This approach violates the idea of knowing what you’re executing, but more relevantly it’s a problem of establishing trust. We implicitly trust those who write Composer not to attack us – there’s just nothing specifically binding that trust to composer.phar and its installer.

As the recent GnuTLS vulnerability has highlighted, there is actually an app for this! In the Ubuntu/Debian ecosystem we download code all the time. Even with TLS broken (as we know it definitely was until yesterday), we still have some faith in the packages we’ve been downloading for years, because they employ an additional defence against substitution – they are signed.

Package Signing

File signing is a simple notion. A trusted author/maintainer signs a file with a private key that only they possess. Everyone can then verify the signature using a published public key. So long as the private key is secure, and the public key is accepted once by downloaders (with due consideration AND not replaced unless absolutely necessary), then an attacker will find it really difficult to replace that file with a tainted copy. There’s a reason for that, and it resembles the benefits of TLS pinning: you are reducing the number of trusted parties to just one – the trusted author or maintainer of the code. Even when TLS completely collapses, you can still verify the file signature to detect any tampering.

I mentioned it earlier, but Composer is NOT a package manager. The only files that Composer distributes are composer.phar and an installer script. Since it distributes nothing else, it can sign nothing else. It’s a tool that resolves dependencies and then downloads code remotely.

With that in mind there are two parts, if one were to consider file signing support for Composer:

1. Signing composer.phar and the installer; and
2. Signing everything else.

With much of the focus being on securing Composer, the first part raises a few interesting questions, the main one being how to go about establishing trust. Everything boils down to trust, and it has to start somewhere. If we think hard about piping code we just downloaded through the PHP interpreter, there’s the problem of explaining how this is wrong. Is it? Let’s say the installer were signed. It sounds secure, but we’ve just managed to shift the problem elsewhere: we have to download an initial copy of the public key. How does one sign a public key? If we trust the public key, and verify the installer using it, and an attacker has replaced both, we’ve effectively achieved…nothing. In other words, you’re left with the original problem – it’s your problem.

The flaw in Composer’s installer isn’t that it’s unsigned, it’s that it doesn’t afford the opportunity for the downloader to read it before it gets piped to PHP. It’s a documentation issue. You can go down the route of using a CA, of course, but that’s further down the rabbit hole than may be necessary.

Signing the composer.phar file is another matter. In an ideal world, you’re going to use the installer precisely once. Once you have a copy of composer.phar, you can just distribute and reuse it locally. You wouldn’t be downloading the installer hundreds of times, providing hackers with hundreds of opportunities to attack you; you’ll be copying composer.phar around and using self-update hundreds of times. Signing composer.phar therefore makes sense. That first copy can enforce signature verification on each subsequent update using the original copy of the public key. It limits the available victims of an attack to a significant degree, with the bonus that unverifiable signatures create an immediate early warning system should a compromise at the server or TLS level ever occur.
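As a sketch of how a cached public key could gate self-update (using a detached openssl signature; the key and file names are invented, and this is not how Composer actually works today):

```shell
set -e
workdir=$(mktemp -d) && cd "$workdir"

# The maintainers' keypair; users cache the public key alongside their
# first copy of composer.phar.
openssl genpkey -algorithm RSA -out maintainer.key 2>/dev/null
openssl rsa -in maintainer.key -pubout -out maintainer.pub 2>/dev/null

# Release time: publish the new phar plus a detached signature.
echo 'new composer.phar build' > composer.phar.new
openssl dgst -sha256 -sign maintainer.key \
    -out composer.phar.new.sig composer.phar.new

# self-update: swap in the new phar only once its signature verifies
# against the public key cached from the ORIGINAL download.
if openssl dgst -sha256 -verify maintainer.pub \
     -signature composer.phar.new.sig composer.phar.new >/dev/null 2>&1; then
    mv composer.phar.new composer.phar
else
    echo 'Signature verification failed: refusing to update!' >&2
    exit 1
fi
```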

So the fuss over the installer isn’t entirely well placed. Whether it’s the installer, or a public key, that first download will be unsigned anyway. It’s the subsequent downloads that you need to watch. That will probably be my next PR looking for review. It is possible to implement an encapsulated signature verification check for phar files using openssl.

Signing Everything Else

In much the same way that one may not trust downloads from getcomposer.org, there are two other broad sources for downloaded files: packagist.org and The Planet Earth. I’m going to examine the second, but it’s worth noting that, after all else is done, at some point we will download packages.json. This demonstrates another issue with signing things. If we’re assuming that the server has no access to the private key (since an attacker would love to see that), we have to assume that any automated packages.json updates will be unsigned. Yet…this file provides all of the package metadata consumed by Composer. Let your brain cells chew on that.

Another day, another blog post…

Back to global downloads: if we don’t trust Composer files, why should we trust anyone else’s files? They are all going to lead to PHP executing downloaded code. This is the wonderful realm of package signing. You can see this in action for Debian apt (also Ubuntu…obviously). It’s a simple setup that PHP could borrow. I say PHP, because Composer is not a package manager, so whether you sign anything or not is not its call – though it could support the architecture needed to make signing possible.

One method is very straightforward. In a manifest file, you list all files to be downloaded with their matching checksums, i.e. the calculated hash for the file. If a file is altered, its checksum will change. Rather than sign each file (which would be ridiculous), you just sign the manifest. Upon download, the client can verify the signature of the manifest and trust its contents (or reject it if it doesn’t verify!). Once trust is established, the client can then verify that all files downloaded are a) listed in the manifest and b) have a matching checksum.
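The client-side half of those checks might look like this sketch (the manifest’s signature is assumed to have been verified already; the package layout and file names are invented for illustration):

```shell
set -e
workdir=$(mktemp -d) && cd "$workdir"

# A downloaded package plus its (already signature-verified) manifest.
mkdir pkg
printf 'alpha\n' > pkg/a.php
printf 'beta\n'  > pkg/b.php
(cd pkg && sha256sum a.php b.php) > MANIFEST

cd pkg
# (a) every file we received must be listed in the manifest...
for f in *; do
    grep -q " $f\$" ../MANIFEST || { echo "unlisted file: $f" >&2; exit 1; }
done
# (b) ...and every listed checksum must match the file's actual contents.
sha256sum --check --quiet ../MANIFEST
```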

This could be implemented via git, so it’s already compatible with tagged release ZIP downloads, under the assumption that tagged releases would be signed. For dev-master, well, you have to take risks sometimes, I suppose. You can’t expect something like Zend Framework to sign each and every commit. Using the master of a git repo is a risky affair.
