Hacker News | pornel's comments

For a European, it's really hard to compare the value of social services, especially healthcare. US healthcare costs seem highly variable depending on the person/provider/medical history, and the whole pricing model is incomprehensible. Even if I could get a real quote, I still couldn't know what quality of care I'd actually get. Over here we keep hearing horror stories like "I stubbed my toe, and the invoice for Tylenol bankrupted my family", and it's hard to know the real likelihood of such problems. It's hard to put a monetary value on the peace of mind of knowing I can call an ambulance any time, anywhere, and won't be fighting any stupid invoices afterwards. I never have to worry about some "out of network" doctor, because there is no such thing.

There are other things that are hard to judge. I have guaranteed paid holidays and sick days. People serving me in restaurants have a living(ish) wage without tips, and also guaranteed sick days (it's absolutely disgusting to think people could be coming to work sick, especially in the food industry!). There are higher food standards, and there aren't subsidized processed corn derivatives added to everything. I have plenty of consumer protections (over here, saying "I know my EU rights" in the Apple Store magically gets you an extra year of warranty).

I can live in many cities that have competent public transport, and basic shops and amenities within walking or biking distance, with roads prioritizing pedestrian safety. I could earn more and buy a luxury car, but I can already take a train, and read or have a nap while it's "fully self-driving" to the destination, bypassing traffic.


When it's time to replace the battery, you'll likely get a larger one for less money. You may also be able to sell or reuse the old battery for stationary power storage.

The Nissan Leaf (one of the oldest mass-market BEVs) launched with a 24 kWh battery. Now it has 40 kWh for the same price (less, if adjusting for inflation), with a 62 kWh upgrade option.

Batteries tend to degrade 1%-2% per year, and there's no hard cutoff at which you have to replace them. In the Model S (another old EV model, with over a decade of data available), the cooling system tends to die sooner than the battery it cools.
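
As a rough back-of-the-envelope check (my own arithmetic, assuming the fade compounds yearly):

```python
# Compounding 1%-2% annual capacity fade over a decade (illustrative only).
for rate in (0.01, 0.02):
    remaining = (1 - rate) ** 10
    print(f"{rate:.0%}/yr fade -> {remaining:.0%} capacity after 10 years")
```

Even at the pessimistic end, that leaves roughly 80% of the original capacity after ten years.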

BEV battery recycling is still in its infancy, because EV batteries are lasting longer than expected.


The mistake was calling it Anna's Archive, and not Anna's AI startup.


JPEG's compression is very similar to blurring.

It converts the image into the frequency domain and reduces the precision of the frequency coefficients, with a special case for coefficients that round down to zero (those are nearly free to store).

However, whole-image blur crosses the 8x8 block boundaries, so it isn't perfectly aligned with the "blur" that JPEG uses. Lowering the quality setting in JPEG (or using custom quantisation tables) will be more effective.

There's also an information-theoretic reason this works: a blurred image keeps only the low frequencies, so it simply contains less information to encode.
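
Here's a toy sketch of that quantisation step (pure Python, 1-D for brevity; real JPEG applies a 2-D DCT to 8x8 blocks with a per-coefficient quantisation table):

```python
import math

def dct(x):
    """Orthonormal DCT-II of a length-N signal."""
    N = len(x)
    return [
        (math.sqrt(1 / N) if k == 0 else math.sqrt(2 / N)) *
        sum(x[n] * math.cos(math.pi * (2 * n + 1) * k / (2 * N)) for n in range(N))
        for k in range(N)
    ]

def idct(X):
    """Inverse transform (DCT-III), matching the normalization above."""
    N = len(X)
    return [
        sum((math.sqrt(1 / N) if k == 0 else math.sqrt(2 / N)) *
            X[k] * math.cos(math.pi * (2 * n + 1) * k / (2 * N)) for k in range(N))
        for n in range(N)
    ]

def quantize(X, step):
    # JPEG-style: round each coefficient to a multiple of its step.
    # Small high-frequency coefficients round to exactly zero.
    return [round(c / step) * step for c in X]

row = [52, 55, 61, 66, 70, 61, 64, 73]      # one 8-sample slice of a block
coarse = idct(quantize(dct(row), 50))        # heavy quantization acts like blur
```

With a coarse step, the small high-frequency coefficients become exactly zero, which is both what makes the file small and what makes the result look blurred.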


Demos, and links to the paper:

https://news.ycombinator.com/item?id=40896762


This is a clever algorithm for computing global illumination in real time, for any number of lights. The site shows 2D, but the algorithm can work in 3D, both screen space and world space.

It's based on the penumbra hypothesis: rays cast for soft shadows need either high spatial resolution or high angular resolution, but not both at the same time. This means the storage for all light probes stays finite even for infinitely long rays, and that rays traced at different spatial and angular resolutions can be interpolated and merged without light leaks, as long as the resolutions satisfy the hypothesis.
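
A quick numeric illustration of the storage claim (my own sketch with hypothetical grid sizes, 2D case):

```python
# Hypothetical 2D probe hierarchy: each cascade halves probe density per axis
# (4x fewer probes) but quadruples the number of ray directions, so storage
# per cascade stays constant. Ray interval lengths can then grow geometrically,
# covering unbounded distance with only logarithmically many cascades.
W = H = 256                       # made-up cascade-0 probe grid
storage = []
for level in range(4):
    probes = (W >> level) * (H >> level)
    directions = 4 ** level       # angular resolution quadruples per level
    storage.append(probes * directions)

assert len(set(storage)) == 1     # identical storage at every cascade level
```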

There's a paper that explains it better than my comment :)


It does guarantee freedom from data races. However, that guarantee applies only to data types and code written in Rust.

The dangerous interaction between signals and other functions is outside of what Rust can help with.


There are several crates available which implement the dangerous parts of signal handling safely for you.


There are, but safety of their implementation is not checked by the language.

Rust doesn't have an effect system nor a similar facility to flag what code is not signal-handler-safe. A Rust implementation could just as likely call something incompatible.

Rust has many useful guarantees, and is a significant improvement over C in most cases, but let's be precise about what Rust can and can't do.


> Rust doesn't have an effect system nor a similar facility to flag what code is not signal-handler-safe.

My understanding is that a sound implementation of signal handling in Rust would require the signal handler to be Send, and to only access shared data that is Sync (safe to share between threads). I guess thread-safe does not necessarily imply signal-safe, though.

And of course you could still call into a signal-unsafe C function, but that requires an unsafe block, explicitly acknowledging that Rust's guarantees do not apply.


Signal handlers are not threads. Rust doesn't have anything that expresses the special extremely restrictive requirements of signal handlers.

A safe-Rust thread-safe Send+Sync function is allowed to call `format!()`, or `Box::new`, or drop a `Vec`, all of which will directly cause the exact same vulnerability as in SSH.

There is nothing in Rust that can say "this function must not drop a Vec", and there are no tools that will tell you whether any function you call may do it. Rust can't even statically prove that panics won't happen, and Rust's panic implementation performs heap allocations, so any Rust construct that can panic is unsafe to use in a signal handler.


The crates that I have looked at work by installing their own minimal signal handler, which then puts a message into a channel or otherwise safely notifies your code that the signal fired.

Of course, you are still trusting that the implementation is sound.
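
For illustration, the usual shape of that pattern is the self-pipe trick. A minimal Python sketch (illustrative only; CPython already defers signal handlers to the interpreter loop, so a real async-signal-safe implementation has to live in the systems language):

```python
import os
import signal

# Self-pipe trick: the handler does only one trivially safe action
# (write a byte to a pipe); all real work happens later in the main loop.
read_fd, write_fd = os.pipe()
os.set_blocking(read_fd, False)

def handler(signum, frame):
    os.write(write_fd, b"\x00")   # the only thing the handler ever does

signal.signal(signal.SIGUSR1, handler)

# Elsewhere, the main loop polls read_fd and reacts at a safe point:
signal.raise_signal(signal.SIGUSR1)
assert os.read(read_fd, 1) == b"\x00"
```

The point is that the handler itself never allocates, formats, or locks; the dangerous work is deferred to normal code.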


Point 5 is misleading. This isn't a typical model where paying customers subsidize the rest.

Cloudflare makes money from the free traffic by caching it. Caching dramatically reduces the amount of data that needs to be transferred over the backbone, which saves ISPs a ton of money, and improves latency for their customers.

From S-1 filing, under "Our Business Model" section:

https://www.sec.gov/Archives/edgar/data/1477333/000119312519...

> Given the large customer base we have and the immense amount of Internet traffic that we manage, we are able to negotiate mutually beneficial agreements with Internet Service Providers (ISPs) that allow us to place our equipment directly in their data centers, which dramatically drives down our bandwidth and co-location expenses.


How is that "making money"?


From deals with ISPs that benefit from the caching of the traffic. The whole world fetching stuff from us-east-1 is an inefficiency that Cloudflare fixes, so the more customer traffic Cloudflare can cache or generate directly from its edge network, the better deals it can make with ISPs.

The peering and co-location partnerships also have a secondary effect of expanding Cloudflare's network, which allows Cloudflare to sell services on top of that, at large scale, with low latencies.


They don't explain what event-driven means, but AFAIK it's based on diffs between frames, which highlight motion and de-emphasise overall brightness/exposure:

https://github.com/uzh-rpg/rpg_vid2e?tab=readme-ov-file#read...
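
A rough sketch of that diff-based approximation (my own illustration, not the repo's actual code):

```python
# Approximate an event stream from two frames: emit (x, y, polarity) wherever
# per-pixel brightness changed by more than a threshold. Real event-camera
# pixels fire asynchronously on intensity changes; this is just the
# frame-based approximation used by video-to-event converters.
def frame_diff_events(prev, curr, threshold=0.1):
    events = []
    for y, (row_p, row_c) in enumerate(zip(prev, curr)):
        for x, (p, c) in enumerate(zip(row_p, row_c)):
            if abs(c - p) >= threshold:
                events.append((x, y, 1 if c > p else -1))
    return events

# One pixel brightened, one unchanged, one dimmed:
frame_diff_events([[0.0, 0.5, 0.9]], [[0.5, 0.5, 0.2]])
```

Note that a uniform change in exposure below the threshold produces no events at all, which is why motion dominates the output.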


I believe the important point about event cameras is that the diffs are per-pixel and entirely asynchronous, so there is no concept of a frame.

The video data is simply a stream of events which encode the time and location of a brightness change. For an immediate full-scene change (like removing the lens cap), you’d get a stream that happens to update every pixel, but there’s no particular guarantee about the ordering.


That's incorrect. Event cameras are different hardware devices than your traditional camera.


It's a shame the result is based only on correlation, with many additives and processes lumped together, so individual contributing factors weren't singled out.

The mechanically-separated pink-sludge McNuggets may be ultra-processed, but I'd like to know: if I put a chicken in a blender myself, is that "processed" too? Or maybe the problem is not processing itself, but just additives like HFCS?

As a whole, the definition is too close to the appeal-to-nature fallacy for my taste.


I don't think so; they're a set of principles to use as a classification guide.

In the case of a fresh chicken with nothing done to it but blending, it would count as a minimally processed food.

But if you take the blended chicken and add water, vegetable oil (canola oil, corn oil, soybean oil, hydrogenated soybean oil), enriched flour (bleached wheat flour, niacin, reduced iron, thiamine mononitrate, riboflavin, folic acid), bleached wheat flour, yellow corn flour, vegetable starch (modified corn, wheat, rice, pea, corn), salt, leavening (baking soda, sodium aluminum phosphate, sodium acid pyrophosphate, calcium lactate, monocalcium phosphate), spices, yeast extract, lemon juice solids, dextrose, and natural flavors, then it is considered ultra-processed.

