Closed Bug 1400844 Opened 7 years ago Closed 7 years ago

cryptohi: Implement handling of RSA-PSS signatures on certificates

Categories

(NSS :: Libraries, enhancement, P1)

Tracking

(Not tracked)

RESOLVED FIXED

People

(Reporter: ueno, Assigned: ueno)

References

Details

Attachments

(1 file)

Currently the cryptohi library doesn't provide a means to sign and verify certificates using RSA-PSS. That prevents some tools, e.g. certutil -e or pk12util -i, from working properly on RSA-PSS signed certificates. I'm filing this as a tracker for the following bugs, which share the same root cause: https://bugzilla.mozilla.org/show_bug.cgi?id=1346239 https://bugzilla.mozilla.org/show_bug.cgi?id=1341306 https://bugzilla.redhat.com/show_bug.cgi?id=1432142 I will submit the patches on Phabricator.
Blocks: 1341306, 1346239
See Also: → 1346748
Assignee: nobody → dueno
Status: NEW → ASSIGNED
Priority: -- → P1
Comment on attachment 8909287 [details]
Implement handling of RSA-PSS signatures on certificates

Martin Thomson [:mt:] has approved the revision. https://phabricator.services.mozilla.com/D66#3246
Attachment #8909287 - Flags: review+
Hello Daiki, are you able to check this in? Please see the commit log for how we usually format the commit message: we put the bug number, a short summary, and the reviewer on the first line, which keeps short commit logs easy to read. See https://hg.mozilla.org/projects/nss/ Thanks!
Status: ASSIGNED → RESOLVED
Closed: 7 years ago
Resolution: --- → FIXED
Target Milestone: --- → 3.34
Blocks: 1341302
Blocks: 1341316
See Also: → 1415171
Is there a way to disable this capability? Discussion regarding the policies of CAs issuing such certificates - and the acceptable (and strictly technically enforced) encodings of parameters - is unresolved. It is especially important, as support is introduced, that any DER violations are strictly rejected, which the existing decoders fail to do.
This bug covers not only issuing certificates, but also signing and verification; by "disable this capability", what do you mean specifically? Would it be sufficient to make certutil generate unrestricted certificates by default, as Hubert suggested on bug 1341316?
Verifying should be disabled due to the ecosystem harm that an incomplete or non-strict implementation poses.
Getting the ideal solution should not interfere with deploying PSS. If NSS were the major validator of PSS signatures, Ryan's argument would hold a lot of weight (and we always need to look at the harm of accepting too permissive an entry). This would particularly be true if no one in the ecosystem did strict implementations. In this case, however, not accepting PSS signatures at all is the greater harm, as it slows down the eventual phasing out of PKCS #1. We definitely need to keep the need for strict validation open until that gets fixed.

bob
Unfortunately, I'd like to push back on the assessment that greater harm is done by not enabling than by enabling while weak. As Martin captured in https://tools.ietf.org/html/draft-thomson-postel-was-wrong , there's a cycle where permissiveness results in entrenched errors. This is also an open question for Mozilla policy on the acceptable encodings, to keep situations like https://wiki.mozilla.org/SecurityEngineering/Removing_Compatibility_Workarounds_in_mozilla::pkix from expanding.

I appreciate the desire to see greater adoption of RSA-PSS. I'm not sure that the assessment is well-founded (compare to the arguments for RSA-OAEP), as the literature has captured, but we probably shouldn't use this bug to litigate that preference. To the extent it matters to the certificate-consuming ecosystem, I'd like to suggest that an implementation that is overly lax in what it accepts is a risk to the ecosystem, and should be seen as a bug that would block on-by-default.

My concrete suggestions are:
1) We should define what is an acceptable policy for RSA-PSS signatures in certificates that, much like the TLS WG did, avoids the needless parameterized complexity that has caused multiple implementors concern. I've been discussing the approaches for this with Adam Langley and David Benjamin, and examining what CryptoAPI does (as Windows supports PSS).
2) We should ensure NSS is suitably strict in enforcing those, as part of good hygiene.

Given Martin's contributions to TLS and the RSA-PSS "don't have 20 different permutable configurations" work in that WG, and Gerv's contributions to the Mozilla PKI policy, setting N-I to them for awareness of this bug and the ecosystem implications.
Flags: needinfo?(martin.thomson)
Flags: needinfo?(gerv)
Ryan, thanks for the notice. Right now, TLS 1.3 requires use of PSS, so we do have PSS enabled. We can turn it off in TLS if it comes to that. mozilla::pkix relies on NSS configuration for signaling, but we have something planned for signaling that separately. I've not looked into the encoding issues in any detail. I'd like to understand more about the compatibility risks and would be interested in learning what you conclude in terms of plans. We only have TLS 1.3 enabled in pre-release Firefox, and I think that we have some opportunity to address concerns regarding sloppiness. I think that we'd be willing to break servers that deploy PSS, given that there are currently very few (if not zero) in existence. But I agree that we'd want a plan in place, or we risk leaving this in its current state indefinitely.
Flags: needinfo?(martin.thomson)
re: Comment 11: Right, setting aside the debate about whether signalling support for RSA-PSS in TLS 1.3 should also imply support for it in certificates (and noting how it couples cryptographic implementation details between what have historically been separate layers - i.e. the Section 9.1 discussion), I think it's important to note that for TLS 1.3, there are only three "RSA-PSS" combinations documented (0x0804, 0x0805, 0x0806), with very explicit requirements around their construction/use. That is, it requires MGF1, that the MGF1 digest function be equivalent to the signing digest function, and that the salt length be equal to the digest length.

Regrettably, as applied to PKIX, RSA-PSS is a complete disaster. That's because of the RFC 8017 structure (and its preceding versions):

    RSASSA-PSS-params ::= SEQUENCE {
        hashAlgorithm    [0] HashAlgorithm    DEFAULT sha1,
        maskGenAlgorithm [1] MaskGenAlgorithm DEFAULT mgf1SHA1,
        saltLength       [2] INTEGER          DEFAULT 20,
        trailerField     [3] TrailerField     DEFAULT trailerFieldBC
    }

All fields are left flexible - meaning it's valid to have a PKIX certificate with a hashAlgorithm of SHA-256, a maskGenAlgorithm of MGF1-SHA384, and a saltLength of 20. That is obviously nonsensical, but that's why there is a need to be precise about what is a valid construct (within NSS).

This is further amplified by DER-decoding ambiguity. The NSS parser used (QuickDER) is relatively liberal, and doesn't enforce that default values are omitted from the encoding (per DER), nor does the current implementation enforce trailerField. This is within https://hg.mozilla.org/projects/nss/diff/84e886ea090e/lib/cryptohi/secvfy.c which uses SEC_QuickDERDecodeItem with SECKEY_RSAPSSParamsTemplate and sec_RSAPSSParamsToMechanism, and then https://hg.mozilla.org/projects/nss/diff/84e886ea090e/lib/cryptohi/seckey.c - which fully support the above-described 'silliness', and don't have checks for things like trailerField.

I don't mean to suggest that RSA-PSS support is not valuable (although its value may be overstated), but to highlight that the silliness of permutations that the TLS WG wisely avoided (by explicitly specifying the permutations and entirely avoiding the encoding issues of 8017) unfortunately manifests here. Because this is now exposed through the SGN_/VFY_ APIs, it's also now surfaced to the 'legacy' API callers of NSS, without any sanity checks. It's not yet exposed through mozilla::pkix, as https://dxr.mozilla.org/mozilla-central/source/security/pkix/lib/pkixverify.cpp#69 does not yet switch on RSA-PSS, and that's needed by https://dxr.mozilla.org/mozilla-central/source/security/pkix/lib/pkixbuild.cpp#204 , but that's why I suggest disabling the VFY_ code until these issues are worked out.
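To make the strictness requirement concrete, a minimal C sketch of such a check might look like the following, assuming the RSASSA-PSS-params have already been DER-decoded into a plain struct. The type and function names here are hypothetical illustrations, not the actual NSS API.

    #include <stdbool.h>

    /* Hypothetical, already-decoded view of RSASSA-PSS-params; this is NOT
     * the NSS parameter type. Hash algorithms are identified by digest
     * length purely to keep the sketch short. */
    typedef struct {
        unsigned hashLen;      /* digest length of hashAlgorithm, in bytes */
        unsigned mgfHashLen;   /* digest length of the MGF1 hash, in bytes */
        bool     mgfIsMGF1;    /* maskGenAlgorithm is id-mgf1              */
        unsigned saltLength;   /* saltLength field                         */
        unsigned trailerField; /* trailerField; 1 == trailerFieldBC (0xBC) */
    } PssParams;

    /* Strict profile along the lines of TLS 1.3: MGF1 only, MGF hash equal
     * to the message hash, salt length equal to the digest length,
     * trailerField of 1, and only SHA-256/384/512 (so the SHA-1 defaults
     * can never be reached). */
    static bool
    pss_params_acceptable(const PssParams *p)
    {
        if (!p->mgfIsMGF1)
            return false;
        if (p->hashLen != p->mgfHashLen)
            return false;
        if (p->saltLength != p->hashLen)
            return false;
        if (p->trailerField != 1)
            return false;
        if (p->hashLen != 32 && p->hashLen != 48 && p->hashLen != 64)
            return false;
        return true;
    }

Under a check of this shape, the SHA-256 / MGF1-SHA384 / salt-20 combination described above would be rejected outright.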
For what it's worth, the discussion from folks here has been:

Option 1) Add a new set of OIDs akin to the TLS algorithm IDs, which have no parameters (as the parameters are optional, this means explicitly *not* present) and which represent sensible combinations. That is, you could imagine rsa-pss-notabadidea-sha256, which means a hashAlgorithm of SHA-256, MGF-1 with SHA-256, and a saltLen of 32. You can repeat this for -sha384 and -sha512. Use OIDs, rather than permutations of hellacious extensibility in the parameters. What does this take: interest from a WG (most likely LAMPS), an I-D defining those OIDs, and implementing those OIDs (which would require changes to both NSS and CryptoAPI).

Option 2) Define, via policy (hi Gerv!), what an acceptable set of permutations is for PKIX certificates, which effectively aligns with what TLS has done (hashAlg == mgf-hash-alg, mgf == mgf-1, saltLen == len(hashAlg hash)). What does this take: defining by policy the exact DER representation of such signatureAlgorithms (see https://github.com/mozilla/pkipolicy/issues/38 for related), so that it's completely unambiguous for CAs, and similarly enforcing this in code - notably, in the VFY_ APIs. This ensures that the issues that have plagued RSASSA-PKCS1-v1_5 (such as NULL vs OPTIONAL params) can be suitably avoided.

Our preference would be Option 1, but the pragmatic reality is that it likely means Option 2 is the viable path.
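For reference, the parameter combinations Option 2 would leave acceptable can be written out as a small table. The digest and salt lengths follow directly from the hashAlg == mgf-hash-alg, saltLen == len(hash) rule; the names and layout below are only an illustrative sketch, not an agreed specification.

    /* Illustrative only: the three combinations that would remain
     * acceptable under Option 2, mirroring the TLS 1.3 rsa_pss_* schemes
     * (0x0804-0x0806). */
    struct PssProfile {
        const char *hash;    /* message hash, also used for MGF1 */
        unsigned digestLen;  /* bytes */
        unsigned saltLen;    /* bytes; equal to digestLen */
    };

    static const struct PssProfile kAcceptableProfiles[] = {
        { "SHA-256", 32, 32 },
        { "SHA-384", 48, 48 },
        { "SHA-512", 64, 64 },
    };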
Thanks Ryan. As for the problem with signaling support for signature algorithms up the chain, the current plan is to publish a separate I-D that documents a new extension that is like signature_algorithms, but overrides signature_algorithms when it comes to signatures used up the chain. That is, it would describe what the PKIX library accepts as opposed to TLS. It would inherit the same value if the extension were absent - which isn't ideal - but we have to make some concessions to badward compatibility (that was a typo there, but I decided to keep it).

Based on your Option 1 (in comment 13), that draft would probably want to reference whatever work you come up with to define non-crazy OIDs for PSS rather than a specific profile of the existing PSS. It might be possible to limit what we need to strict profiles { SHA-X, MGF-1+SHA-X, len(SHA-X) }, but only if we end up stuck with Option 2. That would require nailing down the permutations in libraries like moz::pkix and probably not including a definition for SHA-1 (to avoid the default value morass). Also, we should reject an incorrect trailerField, which we should do anyway.

I agree that allocating cleaner OIDs seems like the best plan, but I will still request that we do some of the ground-work here so that NSS won't generate keys with bad parameters, even though it might use the existing OIDs, and that NSS will reject signatures with bad parameters (where bad = a hodge-podge of mismatched values).
I am very happy to be advised by the assembled company of crypto experts as to what changes, if any, need to be made to Mozilla policy. :-) Gerv
Flags: needinfo?(gerv)
(In reply to Ryan Sleevi from comment #12)
> I don't mean to suggest that RSA-PSS support is not valuable (although its value may be overstated)

Microsoft AD deployments have been using RSA-PSS signatures on certificates (with regular RSA keys) by default for four years now. Firefox cannot be deployed in such environments, as it won't be able to verify the certificates of internal services.

(In reply to Martin Thomson [:mt:] from comment #14)
> Based on your Option 1 (in comment 13), that draft would probably want to reference whatever work you come up with to define non-crazy OIDs for PSS rather than a specific profile of the existing PSS. It might be possible to limit what we need to strict profiles { SHA-X, MGF-1+SHA-X, len(SHA-X) }, but only if we end up stuck with Option 2.

I'd be against doing that. There already are widely deployed implementations (Schannel, OpenSSL 1.0.1) that can verify RSA-PSS signatures made with RSA keys and that will Just Work™ if a new PKIX profile limits the acceptable parameters. And for public CAs we have crt.sh and CT to make sure that we don't have to deal in practice with a hodgepodge of parameters.

> I agree that allocating cleaner OIDs seems like the best plan, but I will still request that we do some of the ground-work here so that NSS won't generate keys with bad parameters

AFAICT that's already done - the signature hash selects the MGF1 hash, at least at the certutil level.

(In reply to Gervase Markham [:gerv] from comment #15)
> I am very happy to be advised by the assembled company of crypto experts as to what changes, if any, need to be made to Mozilla policy. :-)

Posted "Mozilla RSA-PSS policy" to dev-tech-crypto@lists.mozilla.org to discuss it.
(In reply to Hubert Kario from comment #16)
> I'd be against doing that. There already are widely deployed implementations (Schannel, OpenSSL 1.0.1) that can verify RSA-PSS signatures made with RSA keys and that will Just Work™ if a new PKIX profile limits the acceptable parameters. And for public CAs we have crt.sh and CT to make sure that we don't have to deal in practice with a hodgepodge of parameters.

There is no sensible reason to support the hodgepodge parameters. It is both unnecessary complexity and poses real ecosystem risk - to both external and internal deployments. Similarly, deferring to CT to "make sure everyone gets it right" is a poor justification for not doing it correctly in the implementation. I think this may stem from a disagreement about whether the RFC, as written, is sensible - but I also don't see any desire to support RFC 6170, which despite having an RFC is equally a hodgepodge of unnecessary complexity. At a product level, decisions need to be made not on the basis of 'implement all the features' but 'make good decisions' - the former approach is what caused massive security issues such as Heartbleed, and the complexity introduced by such approaches is exactly why the TLS 1.3 WG is still struggling with middleboxes (which, due to complexity in previous versions, unfortunately botched very important details).

> > I agree that allocating cleaner OIDs seems like the best plan, but I will still request that we do some of the ground-work here so that NSS won't generate keys with bad parameters
>
> AFAICT that's already done - the signature hash selects the MGF1 hash, at least at the certutil level.

Note that you snipped an important part of Martin's reply, which is

> and that NSS will reject signatures with bad parameters (where bad = a hodge-podge of mismatched values).

My request - and one I will keep pushing on and encouraging a broad consensus decision on, if necessary - is that support for RSA-PSS be disabled until that is implemented. I think the significant risk introduced by enabling this, without such mitigation and robustness, is a harm to the ecosystem and an indicator that the support is 'not yet ready'. This is like landing code without unit tests - yes, we can say the code 'works' based on reading it, but it's not complete, nor can we be confident it will work in practice. Adding support for new algorithms without learning from the mistakes of past algorithms is, in my view, an incomplete implementation - and disabling by default is a sensible approach until the feature is truly complete.

> (In reply to Gervase Markham [:gerv] from comment #15)
> > I am very happy to be advised by the assembled company of crypto experts as to what changes, if any, need to be made to Mozilla policy. :-)
>
> Posted "Mozilla RSA-PSS policy" to dev-tech-crypto@lists.mozilla.org to discuss it.

Did you mean to discuss this on https://groups.google.com/forum/#!forum/mozilla.dev.security.policy ? That is where Policy discussions take place.
(In reply to Ryan Sleevi from comment #17)
> (In reply to Hubert Kario from comment #16)
> > > I agree that allocating cleaner OIDs seems like the best plan, but I will still request that we do some of the ground-work here so that NSS won't generate keys with bad parameters
> >
> > AFAICT that's already done - the signature hash selects the MGF1 hash, at least at the certutil level.
>
> Note that you snipped an important part of Martin's reply, which is
>
> > and that NSS will reject signatures with bad parameters (where bad = a hodge-podge of mismatched values).
>
> My request - and one I will keep pushing on and encouraging a broad consensus decision on, if necessary - is that support for RSA-PSS be disabled until that is implemented. I think the significant risk introduced by enabling this,

What exact risk?

Just the possibility that certificates with "bad" parameters get deployed? Why do you define it as "significant" when the whole situation won't persist for longer than a few months, half a year at worst? Do you have examples of public CAs that plan to use/support RSA-PSS in such time-frames?

> without such mitigation and robustness, is a harm to the ecosystem and an indicator that the support is 'not yet ready'. This is like landing code without unit tests - yes, we can say the code 'works' based on reading it, but it's not complete, nor can we be confident it will work in practice. Adding support for new algorithms without learning from the mistakes of past algorithms is, in my view, an incomplete implementation - and disabling by default is a sensible approach until the feature is truly complete.

NSS will not accept an RSA signature unless it's a correct PKCS#1 v1.5 or RSA-PSS signature, which means the only way to "exploit" this issue is to force the signer to sign some attacker-controlled data. And in such situations there are much bigger issues at hand. IOW, it's only a problem if other things fail, not in isolation.

> > (In reply to Gervase Markham [:gerv] from comment #15)
> > > I am very happy to be advised by the assembled company of crypto experts as to what changes, if any, need to be made to Mozilla policy. :-)
> >
> > Posted "Mozilla RSA-PSS policy" to dev-tech-crypto@lists.mozilla.org to discuss it.
>
> Did you mean to discuss this on https://groups.google.com/forum/#!forum/mozilla.dev.security.policy ? That is where Policy discussions take place.

No, I first wanted to discuss it in a smaller and more focused forum, but yes, ultimately it will need to be discussed on m.d.s.p.
(In reply to Hubert Kario from comment #18)
> What exact risk?
>
> Just the possibility that certificates with "bad" parameters get deployed? Why do you define it as "significant" when the whole situation won't persist for longer than a few months, half a year at worst? Do you have examples of public CAs that plan to use/support RSA-PSS in such time-frames?

1) You have made it clear you disagree with the goal - strict checking - so there is no expectation that it will be fixed in "a few months". Half a year IS a long time to ship buggy code.

2) Given that certificates, once issued, represent a challenge to remove trust - a compatibility risk - this is no different than introducing a new API that you will break in a few months. Such a compatibility violation would be unacceptable for an API, but that is exactly what the consequence would be.

> NSS will not accept an RSA signature unless it's a correct PKCS#1 v1.5 or RSA-PSS signature, which means the only way to "exploit" this issue is to force the signer to sign some attacker-controlled data. And in such situations there are much bigger issues at hand.

I am uncertain whether there is just a communication mix-up, but I have already provided you code pointers that demonstrate what you stated is false. It absolutely will accept improperly encoded signatures for both types. For PKCS#1 v1.5 it allows both explicit-NULL and missing parameters. For PSS, it is more liberal than Bernie Sanders in what it will accept as valid. Changing this to be strict is a compat break. We should not be shipping code we know we will break compat with, especially not intentionally.

> > Did you mean to discuss this on https://groups.google.com/forum/#!forum/mozilla.dev.security.policy ? That is where Policy discussions take place.
>
> No, I first wanted to discuss it in a smaller and more focused forum, but yes, ultimately it will need to be discussed on m.d.s.p.

Given that it needs to be discussed there, and given the ecosystem implications, it seems far more productive to avoid having to duplicate the discussion or try to determine when to move it (and the discussion-forking that can happen). It would be great to just have the discussion on m.d.s.p., which has far greater industry participation.
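As an aside on the explicit-NULL vs. missing parameters point: the two DER encodings in question for a sha256WithRSAEncryption AlgorithmIdentifier look like this. The byte arrays are an editorial illustration of the two forms, not code taken from NSS.

    /* AlgorithmIdentifier for sha256WithRSAEncryption (1.2.840.113549.1.1.11)
     * with an explicit NULL parameters field... */
    static const unsigned char kAlgIdExplicitNull[] = {
        0x30, 0x0D,                               /* SEQUENCE, 13 bytes      */
        0x06, 0x09, 0x2A, 0x86, 0x48, 0x86, 0xF7, /* OBJECT IDENTIFIER       */
        0x0D, 0x01, 0x01, 0x0B,                   /*   1.2.840.113549.1.1.11 */
        0x05, 0x00                                /* NULL parameters         */
    };

    /* ...and with the parameters field omitted entirely. Both forms show up
     * in practice, which is exactly the ambiguity being referred to here. */
    static const unsigned char kAlgIdAbsentParams[] = {
        0x30, 0x0B,                               /* SEQUENCE, 11 bytes      */
        0x06, 0x09, 0x2A, 0x86, 0x48, 0x86, 0xF7,
        0x0D, 0x01, 0x01, 0x0B
    };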
(In reply to Ryan Sleevi from comment #19)
> (In reply to Hubert Kario from comment #18)
> > What exact risk?
> >
> > Just the possibility that certificates with "bad" parameters get deployed? Why do you define it as "significant" when the whole situation won't persist for longer than a few months, half a year at worst? Do you have examples of public CAs that plan to use/support RSA-PSS in such time-frames?
>
> 1) You have made it clear you disagree with the goal - strict checking - so there is no expectation that it will be fixed in "a few months". Half a year IS a long time to ship buggy code.

That's not true. I don't disagree that this is an issue, and I do think that it needs to be fixed. It's something that we will be working on, so I don't see how there is an expectation that it won't be fixed. I just do not consider it a blocker to shipping basic or preliminary support for RSA-PSS.

> 2) Given that certificates, once issued, represent a challenge to remove trust - a compatibility risk - this is no different than introducing a new API that you will break in a few months. Such a compatibility violation would be unacceptable for an API, but that is exactly what the consequence would be.

And who would create such certificates? Who would use such certificates? (Please note that mozilla::pkix does _not_ support RSA-PSS even if NSS has it enabled, so they won't work in Firefox anyway.)

> > NSS will not accept an RSA signature unless it's a correct PKCS#1 v1.5 or RSA-PSS signature, which means the only way to "exploit" this issue is to force the signer to sign some attacker-controlled data. And in such situations there are much bigger issues at hand.
>
> I am uncertain whether there is just a communication mix-up,

Likely.

> but I have already provided you code pointers that demonstrate what you stated is false.

I meant it as forged signatures, not as signatures with "bad" parameters.

> Changing this to be strict is a compat break. We should not be shipping code we know we will break compat with, especially not intentionally.

Breaking compatibility for the sake of hardening or security fixes is something NSS does all the time; it's OK as long as it is documented.
See Also: → 1423557