The top ten countries by air pollution listed in another comment hardly produce anything the developed world uses, they mostly export natural resources.
Very surprised to see Belgium in that list. What gives? Our electricity mix isn't too bad. We don't have much heavy industry. We're not heavy A/C users. Is it our meat consumption?
What's worse, our real consumption-based per-capita emissions are still rising (+8%)! We love importing stuff whose pollution happened elsewhere.
Local reasons:
Belgium has a highly industrialized economy, with significant energy-intensive sectors like chemicals, steel, cement, and refining that are heavily reliant on fossil fuels. The port of Antwerp, with its very high volume of very dirty maritime transport, hosts Europe's largest petrochemical cluster.
Belgium also has a strong car culture, company cars as a tax benefit, and a well-developed fossil-fueled freight transport sector. One of the densest road networks in the world leads to heavy road traffic and congestion.
Belgium also hosts the capital of Europe. The diplomat and CxO consultant class flies in and out of Zaventem almost daily.
Facebook is a product, Meta is a company. It was always weird to say Facebook in the context of Instagram, WhatsApp, Oculus, etc. Heck, who even uses Facebook now?
Even if there were no reliable studies, the precautionary principle would suggest limiting foods that are highly processed and of a novel form, very different from what was consumed in the past.
Anticipating a critical misreading, this does not mean that everything that was consumed in the past is automatically good.
More likely, by a huge margin, since it only takes checking a box in some software system, whereas the alternatives you mention require a lot of messy work.
States that aren't governed by rule of law have no issue with doing 'messy work'. All the pieces of the security apparatus are already there, and they are full of people who will obey.
Sorry if my question appears ignorant, but how quickly is quantum really coming? If your prior belief is "nothing practical is ever likely to come out of quantum computing", then so far there is nothing that would seriously suggest you reconsider it.
I do not say this lightly, having followed the academic side of QC for more than a decade.
Given how seriously the spooky parts of the US government are taking it, I would treat it with a similar level of urgency. While we obviously aren't privy to everything they know, their public actions indicate that they don't think it's a hypothetical risk, and that it's something we will need to be ready for within the next decade or so, given technology refresh cycles: they need to get these crypto algorithms in place now, for devices that will still be in production use well into the 2030s.
There's also a good chance that the initial compromises of the classical algorithms won't be made public, at least initially. There are multiple nation-state actors working on the problem in secret, in addition to the academic and commercial entities working on it more publicly.
Well, that depends on whether or not you care about "store now, decrypt later". Will the info you're sending now be totally uninteresting in 5 years? Great, you're probably good. Do you still want it to be secret in 20 years? Implementing post-quantum cryptography might be urgent.
Given how sticky crypto algorithms are, transitioning early is a really good idea. git is still stuck with SHA-1, and there's plenty of triple DES hiding in the boring types of critical infrastructure that no one can update.
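To make the stickiness concrete, here's a minimal Python sketch (not git's actual code) of why swapping hash functions is disruptive: git object IDs are hex SHA-1 digests, and plenty of tooling hardcodes their length.

```python
import hashlib

# Git object IDs are hex SHA-1 digests: 40 characters. SHA-256 digests
# are 64 characters, so any format, database column, or parser that
# hardcoded "40" breaks on migration, even though the hashing itself
# is a one-line change.
obj = b"blob 11\x00hello world"  # git-style object header + content
print(len(hashlib.sha1(obj).hexdigest()))    # 40
print(len(hashlib.sha256(obj).hexdigest()))  # 64
```

The algorithm swap is trivial; it's every downstream assumption about the old algorithm that makes the transition take years.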
It's a reasonable question. The need for quantum-resistant crypto isn't because a practical attack is right around the corner. (Although I do really enjoy the analogy: predicting when we'll get QC-based crypto attacks is like predicting when humans will land on the moon by looking at the altitude of the highest manned flight.) It has more to do with the level of effort it takes to replace infrastructure as critical as cryptography.
Imagine if next year, via magic wand, all current TLS systems were completely and totally broken, such that the whole of the TLS-using internet became effectively unencrypted. How much damage would that do to the ecosystem? Now suppose we had also just invented a new protocol that works: how long would it take to deploy it to just 50%? To 80%? And how long would it take to replace the long tail?
I'll also leave "record now, decrypt later" for another commenter.
The problem is more that people concentrate a lot of energy on hypothetical future quantum attacks when the actual threats have been the same since the '00s: unvalidated input, buffer overflows, bad auth, XSS, injection, etc.
All the big important systems are again and again vulnerable to these attacks (Cisco, M$, Fortinet, etc.) - but of course those aren't "sexy" problems to research and resolve, so we get the same stuff over and over again, while everyone gushes about protecting against science-fiction crypto attacks that have been complete fantasy for the last 30 years.
It’s all a bit tiring to be honest.
It's a mistake to conflate cryptography with application logic errors.
Your argument is akin to,
> The problem is that a lot of physicians concentrate on diabetes or hypertension when there are people who have been stabbed or shot. Constantly hearing about how heart disease is a big problem is tiring, to be honest.
Also, I'm not sure what circles you run in, but if you asked any of my security friends whether they wanted to spend time on a buffer overflow, an XSS injection, or upgrading crypto primitives for quantum resistance... not a single one would pick quantum resistance.
> The problem is more that people concentrate a lot of energy on hypothetical future quantum attacks when the actual threats have been the same since the 00s
Just so I can be sure... you meant having the qubits to deploy such an attack, right? Because really the only thing stopping some of the quantum-computing-based attacks is the number of stable qubits. They're not hypothetical attacks; they've been shown to work.
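For anyone curious why the attacks are considered demonstrated rather than hypothetical: the classical skeleton of Shor's algorithm is easy to run on toy numbers. Only the order-finding step (done here by brute force) is what a quantum computer speeds up exponentially. A hedged sketch, with hypothetical function names:

```python
from math import gcd

def factor_via_order(N, a):
    # Find the multiplicative order r of a mod N, i.e. the smallest
    # r with a^r = 1 (mod N). This brute-force loop is the one step
    # a quantum computer accelerates exponentially; everything else
    # in Shor's algorithm is cheap classical post-processing.
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    if r % 2 == 1:
        return None  # odd order: retry with a different a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None  # trivial square root: retry
    return gcd(y - 1, N), gcd(y + 1, N)

print(factor_via_order(15, 7))  # → (3, 5)
```

Quantum demonstrations to date factor numbers of this size; the open question is purely one of scaling stable qubits, not whether the math works.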
> any of my security friends if they wanted to spend time on … quantum
I commend your friends but many people in these HN threads seem to be ready to implement post-quantum encryption right now to protect against some future threats.
> you meant having the qubits to deploy such an attack, right
Yes - last time I checked it was something like 3 stable qubits. It's just so far off from being a reality that I really can't take that research seriously.
I feel like a lot of resources are wasted in this kind of research when we are still dealing with very basic problems that aren’t just as sexy to tackle.
Edit: heart disease is a real thing, so your analogy is lacking - there have been zero real-world security incidents caused by quantum computers.
It's more like "physicians concentrating on possible alien diseases for when we colonise the universe in the future, while ignoring heart disease".
I think one reason people want to take it seriously is that to the non-expert it just looks like a scale engineering problem, and people have proven to be shockingly good at scale engineering over the past century.
Having followed QC myself, I agree with you that we're still pretty far away from QC attacks on current crypto.
The problem is, this sort of question suffers from a lot of unknown unknowns. How confident are you that we don't see crypto broken by QC in the next 10 years? The next 20? Whatever your confidence, the answer is probably "not confident enough" because the costs of that prediction being wrong are incalculable for a lot of applications.
I'd say I'm 99% confident we will not see QC break any crypto considered secure now in the next 20 years. But I'll also say that the remaining 1% is more than enough risk that I think governments and industry should be taking major steps to address that risk.
You don't need any QC attacks if you can far more easily find exploits in the same top-10 vulns that were used 20 years ago…
Industry should first address that very real and serious risk that is present _right now_ before thinking about QC.
It's a fallacy that companies and governments can only work on one thing at a time. We absolutely should be patching current vulnerabilities and implementing quantum-safe cryptography; there's no conflict between these goals.
The reality is that resources are constrained and there is definitely conflict between different goals - if you invest in one thing you can’t invest as much in another.
For me it looks like the investment in QC is way bigger than its real-life impact - which is zero. Sure, it can be a niche field for some more esoteric research - but it shouldn't be the number-one topic for security researchers.
But I get that QC brings in the grant money so naturally research gravitates toward it.
The reality is that resources needed to pursue research are measured in hundreds of thousands and national security budgets are measured in billions in many countries, so your "constrained" claim is pretty much nonsense. That's not even talking about US national security budgets, which are another order of magnitude larger. The US intelligence budget in 2022 was $65.7 billion[1], and there's ample political will to fund whatever intelligence agencies such as the NSA request.
A generous CS PhD salary vs. the NSA's estimated 2013 budget:

$300,000
$10,800,000,000
We can argue over exact allocation amounts but if you're really claiming the NSA can't spare even one researcher salary to research QC security I'm calling bullshit.
Yes, I guess in most of Europe the 200-250 years between Rembrandt and van Gogh is exactly when family names solidified from a simple description ("the one from village X", "son of Y", "the one with red hair") into hereditary names essentially detached from their meaning.
Also, van Gogh's popularity came from France and the work he did there, and by that time family names had been standard in France for a long time (since around the 16th century), much earlier than in the Netherlands.
Do you think it is sufficiently respectful of TeX/LaTeX?
As far as proponents go, I will echo the sentiments of many people who have actually used both TeX and Typst: I have been able to accomplish many things in Typst within an hour or two by writing my own Typst code, that in LaTeX I could only accomplish after several days by cargo-culting indecipherable gibberish from years-old forum posts. I freely admit Typst can't (yet) match LaTeX's long-tail package ecosystem, but it is much more pleasant to use and easier to reason about.
I posted that link here earlier last month[0], and even I think the comment was off-putting, because it's off topic and just a way to put something down. "The link you posted is becoming increasingly irrelevant" doesn't seem to add much to the conversation. To the extent that it does add something (i.e. a comparison of Typst and TeX/LaTeX), it could be phrased very differently. The way it is written now also invites similarly phrased criticism the other way, as seen in other replies. I agree Typst is much more pleasant to write. Also, yes, I doubt the Typst developers would call LaTeX irrelevant; in fact the author specifically points out ways TeX currently outperforms Typst. (Not to imply you stated otherwise.)
>in LaTeX I could only accomplish after several days by cargo-culting indecipherable gibberish from years-old forum posts
Learning basic use of LaTeX takes an afternoon. Understanding the language fully takes "effortful learning", like any other programming language.
I believe the difference is that Typst is effectively a scripting language, not much different from popular ones like JavaScript and Python. If you already know the basics of some programming language, this lets you transfer them very easily and start writing your own scripts very quickly. You also don't need to fully understand the language to do this; the basics will mostly be enough.
In comparison, LaTeX has a very particular way of doing computations that you will have to learn from scratch, and even then it won't be as easy or intuitive. The fact that LaTeX relies so heavily on packages for many things also means you have to learn their details and intricacies when trying to interoperate with them, which makes things even more complex.
I'm ignorant of Typst, but you're missing an important problem with LaTeX: packages are really fragile. The most important property of a programming language is compositionality, and LaTeX has so little of it that I'm generally afraid of picking up packages, because I've wasted so many hours trying to get them to play nice.
I still use LaTeX because of the output quality and the sunk cost... but we can clearly do better.
There is a trivial alternative that military strategists have been suggesting for decades. For a nation of 20+ million, a reservist army of 1 million would be feasible, and it would make the island impossible to invade even if the rest of the world joined forces to try.
That's an interesting perspective, that could be used as an argument by both camps. You say more social democracy, someone else might say, more social cohesion due to shared cultural background and low immigration.
Social democracy is orthogonal to immigration policy.
You can have a welfare state with closed or open borders and anything in between, and you can have a libertarian state with closed or open borders.
For the last few years, most EU countries have been moving toward pretty strict immigration policy, but not toward libertarianism.
Also, Poland is not a good example: it has been accepting A LOT of immigration since ~2014, more than the EU average. But that argument gets very detailed very quickly, so unless you want to go into it, I'll leave it alone.