The artifacts created by subpixel AA are dumb and unnecessary when the pixel density is high enough for grayscale AA to look good. Plus, subpixel AA creates additional artifacts under display scaling. (Not that display scaling itself doesn't also create artifacts - I can't tolerate the scaling artifacts on the iPad, for example.)
Yep, I wasn't thinking just of the direct cost savings, but of how that money would have flowed to other parts of the Finnish economy had it stayed here instead of going to America.
We're in a recession, there are far too many fresh IT grads, and even experienced devs are having a hard time in this job market. Ultimately this affects every area of our economy, as unemployed people aren't buying houses or spending much on services. Shifting some spending from MS licenses to developing local solutions would almost certainly have a positive impact on our whole economy, even if the solutions weren't any cheaper in the end.
It should also be noted that we have a state-funded education system. It seems incredibly dumb to spend lots of public money training programmers, then have them sit unemployed, while spending more public money on mediocre software written by programmers outside the European Union.
In the UK. I know a couple who bought a really old house and then spent 3 million renovating it "period correct", right down to using only plant-based paint, as that was how it was done in the 17th century.
There is pretty strong precedent for this design over in .NET land - if it were awful or notably inferior to `defer`, I'm sure the Chrome engineering team would have taken notice.
C# has the advantage of being a statically typed language, which allows compilers and IDEs to warn in the circumstances I mentioned. JavaScript isn't statically typed, which limits the potential for such warnings.
Anyway, I didn't say it was "inferior to defer", I said that it seemed more error-prone than RAII in languages like Rust and C++.
Edit: Sorry if I'm horribly wrong (I don't use C#) but the relevant code analysis rules look like CA2000 and CA2213.
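For anyone who hasn't seen the feature being discussed, here's a rough TypeScript sketch of a `using` declaration (explicit resource management via `Symbol.dispose`) and of why forgetting the keyword is easy to miss without static analysis - assuming TS 5.2+ and a runtime or polyfill that provides `Symbol.dispose`; `TempFile` and the paths are made up for illustration:

```ts
// A disposable resource: [Symbol.dispose] runs when a `using` binding goes out of scope.
class TempFile implements Disposable {
  constructor(private path: string) {
    console.log(`open ${path}`);
  }
  [Symbol.dispose]() {
    console.log(`delete ${this.path}`);
  }
}

function withUsing() {
  using tmp = new TempFile("/tmp/a"); // disposed automatically when the block exits
}

function withoutUsing() {
  const tmp = new TempFile("/tmp/b"); // compiles and runs fine, but is never disposed -
                                      // nothing flags the missing `using`
}

withUsing();    // open /tmp/a, delete /tmp/a
withoutUsing(); // open /tmp/b ... and the file silently leaks
```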
> Anyway, I didn't say it was "inferior to defer", I said that it seemed more error-prone than RAII in languages like Rust and C++.
It is, but RAII really isn't an option if you have an advanced GC, as it is lifetime-based and requires deterministic destruction of individual objects, and much of the performance of an advanced GC comes from not doing that.
It's still difficult to get right in cases where you hold a disposable as a member. It's not obvious whether disposables that were passed in should also get disposed - what's right depends on the situation (think of a string-based TextWriter that has a byte-based Stream passed into it) - and you need to handle double disposes.
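To make that ownership question concrete in `using`/`Symbol.dispose` terms (a sketch, not the C# API - the class names are invented, and this assumes TS 5.2+ with `Symbol.dispose` available): does a wrapper that receives a disposable also own and dispose it, and what happens when both the wrapper and the caller dispose the same object?

```ts
class Connection implements Disposable {
  #closed = false;
  [Symbol.dispose]() {
    if (this.#closed) return;          // tolerate double dispose
    this.#closed = true;
    console.log("connection closed");
  }
}

class Logger implements Disposable {
  // leaveOpen mirrors the .NET convention: the caller decides whether
  // the wrapper owns (and therefore disposes) the inner resource.
  constructor(private conn: Connection, private leaveOpen = false) {}
  [Symbol.dispose]() {
    if (!this.leaveOpen) this.conn[Symbol.dispose]();
  }
}

{
  using conn = new Connection();
  using log = new Logger(conn);        // default: Logger owns conn
  // At block exit, `log` is disposed first and closes conn; then the
  // `using conn` declaration disposes conn a second time, which only
  // works because Connection tolerates being disposed twice.
}
```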
Further, C# has destructors (finalizers) that get used as a last-resort cleanup for native resources like file descriptors.
> Further, C# has destructors (finalizers) that get used as a last-resort cleanup for native resources like file descriptors.
True, I was going to mention that, but I saw that JS also has "finalization registries", which seem to provide finalizer support, so I figured it wasn't a fundamental difference.
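For reference, a minimal sketch of FinalizationRegistry as a last-resort cleanup hook, analogous to a C# finalizer - `closeNativeHandle` is a hypothetical native binding, and the callback only runs at the GC's discretion, some time after the object becomes unreachable (possibly never):

```ts
// Hypothetical native binding - stands in for whatever actually frees the resource.
declare function closeNativeHandle(handle: number): void;

// The registry's callback is the last-resort path, like a C# finalizer:
// it runs (if at all) whenever the GC notices the object is unreachable.
const registry = new FinalizationRegistry<number>((handle) => {
  closeNativeHandle(handle);
});

class NativeFile implements Disposable {
  constructor(private handle: number) {
    // `this` doubles as the unregister token.
    registry.register(this, handle, this);
  }
  [Symbol.dispose]() {
    // Deterministic path: clean up now and tell the registry to skip the finalizer.
    closeNativeHandle(this.handle);
    registry.unregister(this);
  }
}
```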
As a practical matter it's easy to forget in C# and it's up to you to remember. Those two analyzers are disabled by default and prone to both false positives and false negatives. They hardcoded the known behavior of a bunch of .NET classes to get it to be usable at all.
Great to read - where are we up to with regards to the long laundry list that voice control software like Talon needs?
It's interesting - if you're going to allow third-party a11y software to control your PC, you need a 'make my Wayland compositor do stuff' API.
However, Wayland's intention to explicitly avoid baking specific desktop concepts into its core protocols makes this somewhat of a conflicting design requirement.
> However, Wayland's intention to explicitly avoid baking specific desktop concepts into its core protocols makes this somewhat of a conflicting design requirement.
I would say it's slightly worse. Wayland's intention was to explicitly prevent the implementation of those features in the name of security. To implement a protocol with enough flexibility to allow voice control of the general interface would necessitate walking back limitations that were heavily evangelized.
On the other hand, I'm utterly impressed by how much more stable Wayland on Gnome and Plasma has become over the last year or so, to the point that I've switched to it as my primary desktop. They've also been adding protocols like xdg_toplevel_tag_v1 that were seemingly taboo until recently. I'm optimistic about this current batch of programmers - I think they'll manage to sort out accessibility pretty soon.
Damn that's some scope creep if I ever saw it: 'try sending Arrow frames end to end' => 'rewrite the otel pipeline in rust'. Seems like the goals of the contributors don't exactly align with the goals of the project.
Kind of a bummer - one thing I was hoping to come out of this was better Arrow ecosystem support for golang.
Really nice write-up, thanks. The issues you raise with complex typing are really nicely set out. It's such a trade-off, and you're absolutely right that sometimes simplicity trumps perfection.