Poor quality analogy: should ed25519 only have been incorporated into protocols in conjunction with another cryptographic primitive? Surely requiring a hybrid with ecdsa would be more secure? Why did djb not argue for everyone using ed25519 to use a hybrid? Was he trying to reduce security?
The reason this is a poor quality analogy is that ecdsa and ed25519 are fundamentally similar enough that people had a high degree of confidence there was no fundamental weakness in ed25519, and so it's fine. For PQC, by contrast, the newer algorithms are meaningfully mathematically distinct, and the fact that SIKE turned out to be broken is evidence that we may not have enough experience and tooling to be confident that any of them are sufficiently secure in themselves - so a protocol using PQC should use a hybrid algorithm with something we have more confidence in. And the counter to that is that SIKE was meaningfully different in terms of what it is and does, cryptographers apparently have much more confidence in the security of Kyber, and hybrid algorithms are going to be more complicated to implement correctly, have worse performance, and so on.
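For what it's worth, the core of a hybrid scheme is small: derive the session key from both shared secrets, so an attacker has to break both primitives. A minimal sketch of the concatenate-then-KDF combiner idea, using stdlib HMAC as a toy HKDF - the function names and placeholder secrets are illustrative, not any real protocol's API:

```python
# Sketch of a hybrid key combiner: the session key depends on BOTH a
# classical (e.g. X25519) shared secret and a post-quantum KEM shared
# secret, so both must be broken to recover it. Secrets are placeholders.
import hashlib, hmac, os

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    # HKDF-Extract (RFC 5869): PRK = HMAC(salt, input keying material)
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int = 32) -> bytes:
    # HKDF-Expand: iterate HMAC blocks until enough output is produced
    out, block, counter = b"", b"", 1
    while len(out) < length:
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        out += block
        counter += 1
    return out[:length]

def hybrid_session_key(ss_classical: bytes, ss_pq: bytes) -> bytes:
    # Concatenate-then-KDF: the output stays secret as long as
    # EITHER input shared secret stays secret.
    prk = hkdf_extract(salt=b"\x00" * 32, ikm=ss_classical + ss_pq)
    return hkdf_expand(prk, info=b"hybrid key agreement demo")

ss_ecdh = os.urandom(32)  # stand-in for an X25519 shared secret
ss_kem = os.urandom(32)   # stand-in for a Kyber/ML-KEM shared secret
key = hybrid_session_key(ss_ecdh, ss_kem)
assert len(key) == 32
```

This is roughly the shape of the combiners the Google/Cloudflare experiments and the TLS hybrid drafts use; the implementation overhead is the KDF call and a second key exchange, which is where the performance argument comes in.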
And the short answer seems to be that a lot of experts, including several I know well and would absolutely attest are not under the control of the NSA, seem to feel that the security benefits of a hybrid approach don't justify the drawbacks. This is a decision where entirely reasonable people could disagree, and there are people other than djb who do disagree with it. But only djb has engaged in a campaign of insinuating that the NSA has been controlling the process with the goal of undermining security.
> seem to feel that the security benefits of a hybrid approach don't justify the drawbacks.
The problem with this statement to me is that we know at least one of the four finalists in the post-quantum cryptography competition is broken, so it's very hard to assign a high probability that the rest of the algorithms will stay secure through another decade of advancement (this is not helped by the fact that, since the beginning of the contest, the lattice-based methods have lost a significant number of bits of security as better attacks have been discovered).
Lots of respect to both you and the author, but the rejection gives no real response to any of the issues I see raised in the document.
It failed to raise my confidence at all.
> The IESG has concluded that there were no process failures by the SEC ADs. The IESG declines to directly address the complaint on the TLS WG document adoption matter. Instead, the appellant should refile their complaint with the SEC ADs in a manner which conforms to specified process.
There's a bunch of content that's not actually the complaint, and then there's section 4 which is the actual complaint and is overwhelmingly about procedure.
> and then there's section 4 which is the actual complaint and is overwhelmingly about procedure.
Ah, yes, procedural complaints such as "The draft creates security risks." and "There are no principles supporting the adoption decision.", and "The draft increases software complexity."
I don't know what complaint you're reading, but you're working awful hard to ignore the engineering concerns presented in the one I've read and linked to.
As is made clear from the fact that those issues all link to the mailing list, these are not novel issues. They were raised during discussion, taken into account, and the draft authors concluded they were answered adequately. Complaining about them at this point is fundamentally a complaint that the process failed to take these issues into account appropriately, and the issues should be revisited. Given that this was raised to the IESG, who are not the point of contact for engineering issues, the response is focused on that. There's a mechanism Dan can use to push for engineering decisions to be reviewed - he didn't do that.
> There's a mechanism Dan can use to push for engineering decisions to be reviewed - he didn't do that.
This is the retort of every bureaucracy which fails to do the right thing, and signals to observers that procedure is being used to overrule engineering best practices. FYI.
I'm thankful for the work djb has put in to these complaints, as well as his attempts to work through process, successful or not, as otherwise I wouldn't be aware of these dangerous developments.
Excuses of any kind ring hollow in the presence of historical context around NSA and encryption standardization, and the engineering realities.
Hey, look, you're free to read the mailing list archives and observe that every issue Dan raised was discussed at the time, he just disagreed with the conclusions reached. He made a complaint to the ADs, who observed that he was using an email address with an autoresponder that asserted people may have to pay him $250 for him to read their email, and they (entirely justifiably) decided not to do that. Dan raised the issue to the next level up, who concluded that the ADs had behaved entirely reasonably in this respect and didn't comment on the engineering issues because it's not their job to in this context.
It's not a board's job to handle every engineering complaint themselves, simply because they are rarely the best suited people to handle engineering complaints. When something is raised to them it's a matter of determining whether the people whose job it is to make those decisions did so appropriately, and to facilitate review if necessary. In this case the entire procedural issue is clear - Dan didn't raise a complaint in the appropriate manner, there's still time for him to do so, there's no problem, and all the other complaints he made about the behaviour of the ADs were invalid.
They're adhering to their charter. If you show up to my manager demanding to know why I made a specific engineering decision, he's not going to tell you - that's not the process, that's not his job, he's going to trust me to make good decisions unless presented with evidence I've misbehaved.
But as has been pointed out elsewhere, the distinction between the Dual EC DRBG objections and here is massive. The former had an obvious technical weakness that provided a clear mechanism for a back door, no technical justification for it was ever meaningfully presented, and also it wasn't an IETF discussion. The counterpoints to Dan's engineering complaints (such as they are) are easily accessible to everyone, Dan just chose not to mention them.
The complaint seems well referenced with evidence of poor engineering decisions to me.
> Dual EC DRBG ... had an obvious technical weakness that provided a clear mechanism for a back door
Removing an entire layer of well tested encryption qualifies as an obvious technical weakness to me. And as I've mentioned elsewhere in these comments, it opens users up to a downgrade attack (https://en.wikipedia.org/wiki/Downgrade_attack) should flaws in the new cipher be found. There is a long history of such flaws being discovered, even after deployment, several examples of which DJB references.
I see no cogent reason for such recklessness, and many reasons to avoid it.
Continued pointing toward "procedure" seems to cede the case.
Why don't we hybridise all crypto? We'd get more security if we required RSA+ECDSA+ED25519 at all times, right? Or is the answer that the benefits are small compared to the drawbacks? I am unqualified to provide an answer, but I suspect you are also, and the answer we have from a whole bunch of people who are qualified is that they think the benefits aren't worth it. So why is it fundamentally and obviously true for PQC? This isn't actually an engineering hill I'd die on, if more people I trust made clear arguments for why this is dangerous I'd take it very seriously, but right now we basically have djb against the entire world writing a blogpost that makes ludicrous insinuations and fails to actually engage with any of the counterarguments, and look just no.
I am curious what the costs are seen to be here. djb seems to make a decent argument that the code complexity and resource usage costs are less of an issue here, because PQ algorithms are already much more expensive/hard to implement than elliptic curve crypto. (So instead of the question being "why don't we triple our costs to implement three algorithms based on pretty much the same ideas", it's "why don't we take a 10% efficiency hit to supplement the new shiny algorithm with an established well-understood one".)
On the other hand, it seems pretty bad if personal or career cost was a factor here. The US government is, for better or worse, a pretty major stakeholder in a lot of companies. Like realistically most of the people qualified to opine on this have a fed in their reporting chain and/or are working at a company that cares about getting federal contracts. For whatever reason the US government is strongly anti-hybrid, so the cost of going against the grain on this might not feel worth it to them.
Which insinuations do you think are ludicrous? Is it not a matter of public record at this point that the NSA and NIST have lied to weaken cryptography standards?
The entirely unsupported insinuation that the customer Cisco is describing is the NSA. What's even supposed to be the motivation there? The NSA want weak crypto so they're going to buy a big pile of Ciscos that they'll never use but which will make people think it's secure? There are others, but on its own that should already be a red flag.
The article links a statement from an NSA official that explicitly says the NSA has been asking vendors for this, which seems like fairly strong support to me.
>So why is it fundamentally and obviously true for PQC? This isn't actually an engineering hill I'd die on, if more people I trust made clear arguments for why this is dangerous I'd take it very seriously, but right now we basically have djb against the entire world writing a blogpost that makes ludicrous insinuations and fails to actually engage with any of the counterarguments, and look just no.
As a response to this only, while djb's recent blog posts have adopted a slightly crackpotish writing style, PQC hybridization is not a fringe idea, and is not deployed because of djb's rants.
Over in Europe, German BSI and French ANSSI both strongly recommend hybrid schemes. As noted in the blog, previous Google and Cloudflare experiments have deployed hybrids. This was at an earlier stage in the process, but the long history of lattices that is sometimes being used as a (reasonable) argument against hybrids applied equally when those experiments were deployed, so here I'm arguing that the choice made at the time is still reasonable today, since the history hasn't changed.
Yes, there is also a more general "lots of PQC fell quite dramatically" sentiment at play that doesn't attempt to separate SIKE and MLKEM. That part I'm happy to see criticized, but I think the broader point stands. Hybrids are a reasonable position, actually. It's fine.
"The quantum-safe mechanisms recommended in this Technical Guideline are generally not yet trusted to the same extent as the established classical mechanisms, since they have not been as well studied with regard to side-channel resistance and implementation security. To ensure the long-term security of a key agreement, this Technical Guideline therefore recommends the use of a hybrid key agreement mechanism that combines a quantum-safe and a classical mechanism."
The French position, also quoting the German position:
"As outlined in the previous position paper [1], ANSSI still strongly emphasizes the necessity of hybridation wherever post-quantum mitigation is needed both in the short and medium term. Indeed, even if the post-quantum algorithms have gained a lot of attention, they are still not mature enough to solely ensure the security"
So you've constructed a strawman. Another indication of ceding the argument.
> and the answer we have from a whole bunch of people who are qualified
The ultimate job of a manager or a board is to take responsibility for the decisions of the organization. All of your comments in this thread center around abdicating that responsibility to others.
> This isn't actually an engineering hill I'd die on
Could have fooled me.
> we basically have djb against the entire world
Many of your comments indicate to me that clashing personalities may be interfering with making the right engineering decision.
If the argument is "Why adopt a protocol that may rely on a weak algorithm without any additional protection" then I think it's up to you to demonstrate why that argument doesn't apply to any other scenario as well.
"Why adopt a protocol that may rely on a weak algorithm without any additional protection"
Does not accurately represent the situation at hand. And that seems intentional.
"Why weaken an existing protocol in ways we know may be exploitable?" is a more accurate representation. And I believe the burden of evidence lies on those arguing to do so.
Another strawman. No one in this thread said Kyber was known to be weaker. Just that elliptic curve cryptography is well tested, better understood as a consequence of being used in production longer, and that removing it opens up transmissions made without both to attacks on the less widely used algorithm which would not otherwise be successful.
It really seems like you're trying not to hear what's been said.
As a friendly reminder, you're arguing with an apologist for the security-flawed approach that the NSA advocates for and wants.
There are absolutely NSA technical and psychological operations personnel who are on HN not just while at work, but for work, and this site is entirely in-scope for them to use rhetoric to try to advance their agenda, even in bad faith.
I'm not saying mjg59 is an NSA propagandist / covert influencer / astroturf / sockpuppet account, but they sure fail the duck test for sounding and acting like one.
> If you show up to my manager demanding to know why I made a specific engineering decision, he's not going to tell you
Well if you're working in a standards development organisation then your manager probably should.
It looks like (in the US at least) standards development organisations have to have (and follow) very robust transparency processes to not be default-liable for individual decisions.
(Unlike most organisations, such as the one you and your manager from your scenario come from)
This is just a bureaucracy making up fake excuses. qsecretary, the autoresponder, is way less annoying than having to create a new account everywhere on each SaaS platform. At least you know your mail arrived.
Nobody has any issue forcing other people to use 2FA, which preferably requires a smartphone, but a simple reply to qsecretary is somehow heinous.
The $250 are for spam and everyone apart from bureaucrats who want to smear someone as a group knows that this is 1990s bravado and hyperbole.
It's still nice that it was put up for completeness. And as we know this stuff has a long sordid history of people who are proponents of weakening encryption not giving up easily.
Boeing currently has an awkward gap between the 737 and the widebodies that was previously filled by the 757 - the 737 Max 10 (which still isn't certified!) only has about two thirds of the range of the A321XLR, and a slightly lower passenger capacity. Airlines that currently have 757 fleets and who need that range are going for Airbus instead, and Boeing just doesn't have an answer for it. So while, yes, any new Boeing design is likely to be fly by wire and composite and everything, it also seems likely that it's going to try to fit that market.
The 737 Max 7, the smallest of the Max series, is longer than the 737-200, the stretched version of the original design. A brand new design is going to be able to ignore that market (which basically doesn't exist any more, the Max 7 only has a handful of orders) and scale upwards to also be a 757 replacement. But it's also going to have basically no commonality with the 737, so it's going to have to genuinely be better than the Airbus product because existing Boeing customers aren't going to benefit from being able to move existing pilots to it without retraining or benefit from common maintenance plans and so on. It obviously should be better - the A320 program started over 40 years ago, it's not that much newer than the 737 - but given Boeing's myriad series of failures in recent years and how painful the 787 program was, it's not impossible that they'll fuck this up entirely.
So given that both the basic recipes for the 737 and A320 are pretty old by now, how much "better" could a new clean-sheet narrowbody realistically be, given recent advances in aircraft design?
And how much better would it _need_ to be, in order for large 737 operators to be convinced to place their next order for the new 7007? (yes, like nVidia, I trust they'll just add a number and start over when they run out of numbers).
I'm already suing over the security deposit - that's actually an open and shut case. In this case it's not clear that I actually suffered any personal damage so it's not immediately clear whether I have standing in a civil suit.
Did you ask a lawyer-- any lawyer-- before writing the sentence surrounding this word?
I assume the answer is yes. But I also think I remember reading a blog by you where you wasted hours attempting to reverse-engineer some hardware before finally sending it the help flag.
No, the modified copy included the same certificate page simply because it was a modified copy of the PDF with the certificate page. There's no actual way I've determined to verify the signed checksum field.
Ah, so the 'signed checksum' field isn't actually the checksum of the signed document? How odd . . . but yeah, now that I think about it, they couldn't know the hash of a document before they generate it, but they would need to in order to include it in the document, hence an impossible cycle; they must have overlooked that . . .
Remember to enumerate all breaches of law and professional rules with direct statute citations that you or legal assistance can come up with. Plan on this being read by a judge and feed them a happy path toward a conviction.
It's difficult, because the certification page is part of the PDF so obviously can't include a hash or signature of itself. And you can't just rely on a hash since someone could tamper with the file and just update the hash. A well defined way to extract the signed payload would work, but their design doesn't currently involve any cryptography so it would be a pretty wholescale redesign.
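To make the fixed-point problem concrete, here's a toy sketch (not the actual PDF format - the "HASH:" marker and payload splitting are purely illustrative): embedding a file's own hash changes the file, so the embedded hash can never match, whereas a detached hash over everything except the certificate page verifies fine.

```python
# Why a certificate page can't contain the hash of the file it lives in:
# writing the hash into the document changes the document, and so its hash.
import hashlib

payload = b"original document content"
placeholder = b"HASH:" + b"0" * 64

# Naive attempt: embed the hash of (payload + placeholder) into the file.
naive_hash = hashlib.sha256(payload + placeholder).hexdigest()
finished = payload + b"HASH:" + naive_hash.encode()

# The embedded hash no longer matches the finished file.
assert hashlib.sha256(finished).hexdigest() != naive_hash

# Workable design: hash the payload alone, then append the certificate page.
detached = hashlib.sha256(payload).hexdigest()
file_with_cert = payload + b"HASH:" + detached.encode()

# A verifier strips the certificate page and recomputes the payload hash.
recovered = file_with_cert.rsplit(b"HASH:", 1)[0]
assert hashlib.sha256(recovered).hexdigest() == detached
```

This is essentially how real PDF signatures work: the signature covers specified byte ranges of the file that exclude the signature value itself, which is why "a well defined way to extract the signed payload" is the missing piece here.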
Pentium 4 was never marketed as Centrino - that came in with the Pentium M, which was very definitely not 64-bit capable (and didn't even officially have PAE support to begin with). Atom was its own microarchitecture aimed at low power use cases, which Pentium 4 was definitely not.
David was the first person to employ me as a software developer (working on Dasher, a research project he'd received a grant to turn into an accessibility tool). At the time it felt like a ludicrous amount of money, and I got to travel to a bunch of conferences at someone else's expense. It didn't quite set me up for where I am now, but it was a key part of it.
David was, well. Clearly a genius. Before I worked with him I'd been in another part of the Cavendish, doing sysadmin work for theoretical physicists including a Nobel laureate. The year I worked with David was different - more concentrated learning than I'd ever previously had.
And David was opinionated. Our review meetings would involve him asking for three different new config options based on ideas he'd had, and I'd argue him down to these making sense as a combination but not individually, but also this then being duplicative of some existing options, so if we implemented this correctly I could actually remove a preference instead of adding three more. I probably learned more from that than the coding itself.
And David could absolutely be a dick. He was very invested in his students but he was hard on them, and it was sometimes quite gendered. Probably not worse than the average Cambridge PI of the era (and definitely better than some others I knew), but that's always something that's tainted my experience.
Shortly before his death there was what was effectively a pre-memorial - a number of his past students presented their work, there was a dinner, people had an opportunity to say goodbye. I was lucky enough that the timing worked out for an existing trip to Europe, and I had the opportunity to say goodbye.
David choosing to tie up loose ends before his untimely departure was absolutely his style, and every time his name comes up I remember the fucking dreadful Sun IPX with its 256 colour display I had to use in a terrible office with the worst fluorescents I've ever seen for the first couple of months after I started. Nostalgia is weird, and I wish he was still with us.
I almost did a PhD with David but ended up working at Transversal instead, which was a company he co-founded to do some interesting work in the search engine space! It's what got me into software development as a career so I'm always grateful to him