macOS dropped it a few years ago, primarily because there are no Macs with non-HiDPI displays any more (reducing the benefit of subpixel AA) and to improve uniformity with iOS apps running on macOS via Catalyst (iOS has never supported subpixel AA, since it doesn’t play nice with frequently changing orientations).
I believe Windows still uses RGB subpixel AA; that's why OLED monitor users still need to tweak ClearType settings to make text not look bad.
> because there are no Macs with non-HiDPI displays any more
That is not true. Apple still sells Macs that don't come with a screen, namely Mac Mini, Mac Studio, and Mac Pro. People use these with non-HiDPI monitors they already own all the time.
That's not really the most charitable reading of GP's comment. I think they very clearly mean that Apple does not sell Macs with non-HiDPI displays anymore. It's not a configuration they sell, so they don't need to support those features anymore in their current offerings.
You're right that there's nothing stopping someone from hooking up an HDMI-to-VGA adapter for their 22" Trinitron from 2001, but that doesn't mean boat anchors are a meaningful market segment. It's not a consideration for why they should retain a font rendering feature for a modern OS. You're just going to have to suffer with fuzzy fonts for your retrogaming hobby.
So what is the "configuration they sell" for the desktop Macs? The Studio Display, which costs way too much for what it is, so unsurprisingly they're not selling all that many of them? Or the Pro Display XDR, for which the stand alone costs more than an entry-level Mac Mini? Surely no one will buy a $1,600 monitor to use with their $600 Mac Mini. They'll get a much cheaper third-party 2K one.
Apple put all their chips behind Retina/HiDPI displays. To that end, they've got really good HiDPI resolution scaling, they no longer sell displays incapable of Retina features (in laptops or stand-alone), and they have removed features that only serve to support sub-4k displays. To Apple, 4k is the minimum standard.
If you want a 2K monitor you can buy one and hook it up, but Apple isn't interested in making it look good. It's not a new decision, either. They stopped selling MacBooks without Retina displays in 2016. They haven't supported 2K scaling since the M1 Mac Mini over 5 years ago: https://www.macworld.com/article/549493/how-to-m1-mac-1440p-...
Apple is not a budget vendor. They're a premium vendor. That's not just what other people call them; it's what they themselves profess to be. That's why you can get an Apple Thunderbolt cable for $70. To Apple, if you buy a Mac Mini, yes, they're expecting you to hook it up to a 4k monitor. They expect you to be getting a Mac Mini because you want a Mac, not because you can't afford a MacBook.
Well, in my particular case, I use an M1 Max MBP with a 2K monitor that I already had when I bought the MacBook.
The problem with 27" 4K monitors is that you can't have integer scaling on them. If you set the scaling factor to 1x, everything will be too small, if you set it to 2x, everything will be huge, and macOS can't properly do fractional scaling because what it actually does is render everything into a 2x framebuffer and downscale that for output.
And besides, only supporting HiDPI displays doesn't mean one can stop striving for pixel perfection. I hate SF Symbols icons because they're sizeless. They're an abhorrent blurry mess on my monitor but they're also not all that sharp on the MacBook screen. If you notice it once, it'll haunt you forever. Sorry. They do look fine-ish on iPhones though because those use OLED displays that lack the notion of a pixel grid anyway.
> Why don’t they produce 5K/6K monitors that allow for 2x integer scaling?
Because 5K panels are probably more expensive to produce than 4K ones, and because that would only benefit Mac users, since Windows can do fractional scaling just fine. I'm not sure about that, but it might also be that not all GPUs used in PCs can drive monitors larger than 4K.
Even if Windows/Linux do fractional scaling fine, integer scaling is still desirable if it’s an option. Under both I still run into programs that botch fractional scaling some way or another, and given the proclivity of programs on both platforms to be built with oddball UI toolkits I don’t expect that to ever really fully resolve itself.
It’s one of the chief complaints I have with one of my mostly otherwise good x86 laptops. The 1.5x scaling the display needs has been a pain point on multiple occasions.
Since you are so sure about how Mac Minis are used, is it 2K on 24" or 27" that these customers use?
My impression, based on the limited anecdotal data I have, is that most people with a Mac Mini are using it as their secondary device (everyone has a MacBook). Everyone is using 27" 4K monitors. 4K monitors are not priced that far from 2K monitors, and I think most people who prefer to buy 2K are gamers who want the higher refresh rates their GPUs can sustain at 2K. But gamers are not using Macs anyway.
Viewing distance matters. ppi isn’t the target metric; it’s pixels-per-degree-of-vision that determines whether a display setup is “retina”. 60 ppd corresponds to 20/20 vision in a human.
My 34” monitor is only 4K but is “retina” at the viewing distance in my home office according to this calculator:
https://qasimk.io/screen-ppd/
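The calculation is roughly the following (my own sketch, not the calculator's actual code; the formula and the example viewing distance are assumptions):

    import math

    def pixels_per_degree(h_res, v_res, diagonal_in, distance_in):
        diag_px = math.hypot(h_res, v_res)       # diagonal in pixels
        ppi = diag_px / diagonal_in              # pixels per inch
        pitch = 1.0 / ppi                        # inches per pixel
        # visual angle subtended by one pixel at the given distance
        deg_per_px = math.degrees(2 * math.atan(pitch / (2 * distance_in)))
        return 1.0 / deg_per_px

    # Illustrative: a 34" 3840x2160 panel viewed from ~30 inches comes out
    # around 68 ppd, which clears the 60 ppd "retina" threshold.
    print(pixels_per_degree(3840, 2160, 34, 30))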
They don't sell it as part of the configuration options.
You can separately purchase whatever monitor you wish. There are now plenty of 27" 5K monitors out there: Asus, LG (for now), ViewSonic, Kuycon, and others I'm probably forgetting. They're expensive as far as monitors go, but not as expensive as the Studio Display.
Sure, but they’re not going to optimize for that case because the bulk of their Mac sales are tied up in their laptops and a significant chunk (I’d hazard a guess over 50%) of people buying Studios/Pros especially but also Minis are pairing them with Studio Displays, Pro Display XDRs, or more recently the various third-party 2x HiDPI display offerings from the likes of Asus, BenQ, Dell, and Samsung.
They’re fine to my eye, at least as good as well-tuned FreeType (as found on Ubuntu), as long as you’re either using a 2x HiDPI display or a “normal” DPI monitor with above-average density (e.g. 2560x1440 at 27”) and have subpixel AA forced on.
Where it falls apart is at any lower densities; for example, it struggles on those awful 1366x768 15.6” panels that it seemed like every other laptop was built with for a while. Similarly, 24” 1080p and 32” 2560x1440 are pretty bad.