PSA (for the nth time): about:config is not a supported way of configuring Firefox, so if you tweak features with about:config, don't be surprised if those tweaks stop working without warning.
Kind of an aside: I mostly use Chrome, but I thought I'd give Firefox a go again. Everyone says to use Tree Style Tab, but I couldn't get it to display properly without going into about:config to enable custom CSS and creating a new CSS file somewhere deep in the file system. It's stuff like that that makes everyone use Chrome.
That said, they're admittedly terrible about keeping their documentation updated and letting users know about added/deprecated settings, and they've even been known to go in and modify settings after you've explicitly changed them from the defaults, so the PSA isn't entirely unjustified.
"Two other forms of advanced configuration allow even further customization: about:config preference modifications and userChrome.css or userContent.css custom style rules. However, Mozilla highly recommends that only the developers consider these customizations, as they could cause unexpected behavior or even break Firefox. Firefox is a work in progress and, to allow for continuous innovation, Mozilla cannot guarantee that future updates won’t impact these customizations."
I did a bunch of work with Verisign as a contractor back in the early 2000s and got to see some of the systems and infrastructure issuing a good portion of the world's certificates at that time. 15 years later I was at Google when they let an intermediate certificate in their SMTP certs expire and had a major Gmail outage. At work last week we had a major outage related to certificate issues. Of course there are thousands upon thousands of stories like that in between.
The chains of trust you can build with PKI have been incredibly useful and instrumental to securing code, data and traffic, but the fact that it's still subject to such brittle failure modes is bemusing.
First, one of the purposes of shorter certificates is to make revocation easier in the case of misissuance. Just having certificates issued to you be shorter-lived doesn't address this, because the attacker can ask for a longer-lived certificate.
Second, creating a new browser wouldn't address the issue because sites need to have their certificates be acceptable to basically every browser, and so as long as a big fraction of the browser market (e.g., Chrome) insists on certificates being shorter-lived and will reject certificates with longer lifetimes, sites will need to get short-lived certificates, even if some other browser would accept longer lifetimes.
I always felt like #1 would have better been served by something like RPKI in the BGP world. I.e., rather than say "some people have a need to handle ${CASE}, so that is the baseline security requirement for everyone", you say "here is a common infrastructure for specifying exactly how you want your internet resources to be able to be used". In the case of BGP that turned into things like "AS 42 can originate 1.0.0.0/22 with a maxLength of /23", and now if you get hijacked/spoofed/your BGP peering password leaks/etc., it can result in nothing bad happening because of your RPKI config.
The same idea in web certs could have been something like "domain.xyz can request non-wildcard certs with up to 10 days validity". Where I think web certs fell apart is that they put all the eggs in client-side revocation lists, and when that failed, dealing with it fell to the admins collectively while the issuers sat back.
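The ROA model described above can be sketched as a small origin-validation check. This is a simplified illustration, not real RPKI tooling: the `ROAS` table, ASNs, and prefixes are made-up example data, and `validate` mirrors the valid/invalid/not-found outcomes of BGP prefix origin validation.

```python
import ipaddress

# Illustrative stand-in for a validated ROA cache:
# (origin ASN, authorized prefix, maxLength) -- example data only.
ROAS = [
    (42, ipaddress.ip_network("1.0.0.0/22"), 23),
]

def validate(origin_asn, announced):
    """Classify a BGP announcement as 'valid', 'invalid', or 'not-found'."""
    announced = ipaddress.ip_network(announced)
    covered = False
    for asn, prefix, max_len in ROAS:
        if announced.subnet_of(prefix):
            covered = True
            # Valid only if the right AS originates it and it isn't
            # more specific than the ROA's maxLength allows.
            if asn == origin_asn and announced.prefixlen <= max_len:
                return "valid"
    return "invalid" if covered else "not-found"

print(validate(42, "1.0.0.0/23"))   # right ASN, within maxLength
print(validate(666, "1.0.0.0/24"))  # hijack attempt: wrong ASN, too specific
```

The key property matching the comment above: a hijacker announcing a covered prefix comes out "invalid" regardless of what they claim, with no per-incident revocation step needed.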
For the second note, I think that friction is part of their point. Technically you can, practically that doesn't really do much.
> "domain.xyz can request non-wildcard certs for up to 10 days validity"?
You could be proposing two things here:
(1) Something like CAA that told CAs how to behave.
(2) Some set of constraints that would be enforced at the client.
CAA does help some, but if you're concerned about misissuance you need to be concerned about compromise of the CA (this is also an issue for certificates issued by the CA the site actually uses, btw). The problem with constraints at the browser is that they need to be delivered to the browser in some trustworthy fashion, but the root of trust in this case is the CA. The situation with RPKI is different because it's a more centralized trust infrastructure.
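To make the CAA case concrete, here is a heavily simplified sketch of the lookup a well-behaved CA performs: climb the DNS tree from the requested name and honor the first CAA record set found. The `CAA_RECORDS` dict is a hypothetical stand-in for real DNS, and the real algorithm (RFC 8659) has more cases; the point is that this only constrains CAs that choose to follow it, which is exactly why it doesn't help against a compromised CA.

```python
# Hypothetical stand-in for DNS: domain -> list of (tag, value) CAA records.
CAA_RECORDS = {
    "example.com": [("issue", "letsencrypt.org")],
}

def may_issue(ca_identity, domain):
    """Return True if CAA policy permits this CA to issue for `domain`."""
    labels = domain.split(".")
    # Climb from the full name toward the root (stopping before the TLD
    # for simplicity); the first CAA set found is authoritative.
    for i in range(len(labels) - 1):
        name = ".".join(labels[i:])
        records = CAA_RECORDS.get(name)
        if records:
            issuers = [value for tag, value in records if tag == "issue"]
            return ca_identity in issuers
    # No CAA records anywhere in the tree: any CA may issue.
    return True

print(may_issue("letsencrypt.org", "www.example.com"))  # permitted
print(may_issue("evil-ca.test", "www.example.com"))     # refused
```

Note that the enforcement lives entirely in the CA's own issuance pipeline; nothing here is verifiable by the browser after the fact, which is the gap the comment above describes.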
> For the second note, I think that friction is part of their point. Technically you can, practically that doesn't really do much.
I'm not following. Say you managed to start a new browser and had 30% market share (I agree, a huge lift). It still wouldn't matter because the standard is set by the strictest major browser.
The RPKI-alike is more akin to #1, but avoids the step of trying to bother trusting compromised CAs. I.e., if a CA is compromised, you revoke and regenerate the CA's root keys, and that's what gets distributed, rather than relying on individual revocation checks for each known questionable key or just sitting back for 45 days (or whatever the period is) to wait for anything bad to expire.
> I'm not following. Say you managed to start a new browser and had 30% market share (I agree, a huge lift). It still wouldn't matter because the standard is set by the strictest major browser.
Same reasoning between us I think, just a difference in interpreting what it was saying. Kind of like sarcasm - a "yes, you can do it just as they say" which in reality highlights "no, you can't actually do _it_ though" type point. You read it as solely the former, I read it as highlighting the latter. Maybe GP meant something else entirely :).
That said, I'm not sure I 100% agree it's really about what the strictest major browser does alone. E.g., if Firefox set the limit to 7 days, I'd bet people would start using other browsers rather than all sites rotating certs every 7 days. If some browsers did and some didn't, it'd depend on who and how much share, etc. That's one of the (many) reasons the browser makers are all involved: to make sure they don't get stuck as the odd one out on a policy change.
Thanks for Let's Encrypt btw. Irks about the renewal squeeze aside, I still think it was a net positive move for the web.
I don't feel the problem of a rogue CA misissuing is addressed by the shorter lifetimes either, though; the tradeoff isn't worth it.
The whole CA problem is best summed up by Moxie:
https://moxie.org/2011/04/11/ssl-and-the-future-of-authentic...
And, well, the create-a-browser thing was a joke; it's what I've seen suggested for those who don't like the new rules.
With that said, given that (1) pre-certificates in the log are big and (2) lifetimes are shortening and so there will be a lot of duplicates, it seems like it would be good for someone to make a feed that was just new domain names.
(It doesn't deduplicate if the same domain name appears in multiple certificates, but it's still a substantial reduction in bandwidth compared to serving the entire (pre)certificate.)
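A minimal sketch of such a feed, assuming the dNSNames have already been parsed out of each log entry (the entries below are made-up examples, not real log data): keep a set of names seen so far and emit each name only the first time it appears.

```python
def new_names_feed(entries):
    """Yield each domain name the first time it appears across log entries.

    `entries` is an iterable of per-certificate dNSName lists, in log order.
    """
    seen = set()
    for cert_names in entries:
        for name in cert_names:
            if name not in seen:
                seen.add(name)
                yield name

# Made-up entries: a cert, its short-lived renewal, and a new subdomain.
entries = [
    ["example.com", "www.example.com"],
    ["example.com"],                      # renewal: pure duplicate, dropped
    ["api.example.com", "example.com"],   # only the new name is emitted
]
print(list(new_names_feed(entries)))
# ['example.com', 'www.example.com', 'api.example.com']
```

In a real deployment the `seen` set grows without bound as the logs do, so a production version would likely want a disk-backed set or a probabilistic structure instead of an in-memory one.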
We actually spent some time making sure that we weren't going to run into problems with browsers. However, as the OP points out, because LE had a cross-signature from an existing CA, browsers didn't have to take any positive action to make LE certificates work. This was absolutely essential to getting things off the ground.
Oh, I know you all did and I remember the cross-signing. I worried that you'd get slapped down somehow, that the crappy cert companies would find a way to stop/reverse it, that the project would fizzle out, etc. I thought it was cool as hell but it seemed something so clearly good couldn't stay good but you all have only gotten better over time.
I just explained that. Basically government wants to block some specific webpage, say https://en.wikipedia.org/wiki/Nursultan_Nazarbayev. Without MITM, they'll end up with blocking the entire en.wikipedia.org domain, so citizens will lose access to a lot of information. With MITM, they'll be able to target precisely one page and I can read any other wikipedia article without issues.
And with MITM they can read literally all of your private internet traffic… That seems like a significantly worse tradeoff to just using a VPN to browse Wikipedia.