... Ya know, I haven't thought about that before, but screens are a lot brighter now, aren't they? I wonder how much of the dark mode thing is just that the old good defaults are now eye-burningly bright, so of course people don't like them as much.
Screens can get brighter now, but if your screen is too bright, that's entirely on you. Adjusting the brightness is trivial and you can even do it without using the buttons on the display (DDC/CI).
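For what it's worth, on Linux that's scriptable too. Here's a minimal sketch that shells out to the ddcutil CLI from Python (assuming ddcutil is installed, the i2c device is accessible, and the monitor actually speaks DDC/CI; VCP feature 0x10 is the standard brightness control):

    import subprocess

    # Read/set monitor brightness over DDC/CI via the ddcutil CLI.
    # VCP feature 0x10 is the standard luminance (backlight brightness) control.

    def get_brightness(display: int = 1) -> str:
        """Return ddcutil's report of the current brightness value."""
        result = subprocess.run(
            ["ddcutil", "--display", str(display), "getvcp", "10"],
            check=True, capture_output=True, text=True,
        )
        return result.stdout.strip()

    def set_brightness(percent: int, display: int = 1) -> None:
        """Set brightness (0-100) on the given ddcutil display number."""
        subprocess.run(
            ["ddcutil", "--display", str(display), "setvcp", "10", str(percent)],
            check=True,
        )

    if __name__ == "__main__":
        print(get_brightness())
        set_brightness(40)  # dim to 40% without touching the buttons on the display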
I have a (marginally) HDR monitor, and I run Windows 10's HDR mode. At first it looked noticeably blander than non-HDR, but I've quite grown to like it. I find it very pleasant to use for long stretches for work and such.
When I play games I get the vivid colors and contrast, so all good.
Interesting, I have an HDR monitor but never tried actually using the HDR function. It does seem to make a difference and feels nicer on the eyes. I'll have to see how it affects games too.
About 3 years ago, I actually did research about monitors before buying one, for the first time in my tech-using life (which started around 1990, as I recall). Up until that point, it was never a question of "what is the best monitor for my application?", but rather "what's the biggest, highest-resolution monitor in my (pitiful) price range, that Best Buy (or whoever) has on hand right now?"
It began for me 15 years or so ago, when backlights started to make high-pitched noises and 16:10 monitors started to become less common. It's been rough since for various reasons, but brightness not going low enough isn't something I've experienced as a widespread issue. sRGB mode is pretty dim at 120 cd/m², so monitors are usually able to go at least that low, and typically much lower.
"Too bright" depends entirely on the context. I control-tab and alt-tab like a maniac these days (might reduce that by getting a fourth display, used to think that would be excessive but the Overton window has shifted and I'm coming around on the idea).
The default UI here on HN features your username in #828282 on a background of #F6F6EF [1], a contrast ratio of 3.54:1. The up and downvote arrows are #999999 on #F6F6EF [2], a ratio of 2.62:1. On a high-brightness screen, these less-intense contrasts look great. And this is far from the only place with 'nice calm greys' that are intentionally used by designers to reduce eye strain.
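Those ratios come from the standard WCAG 2.x contrast formula; here's a quick Python sketch that reproduces them, if anyone wants to check other color pairs:

    # Reproduce the WCAG 2.x contrast ratios quoted above.

    def channel_to_linear(c8: int) -> float:
        """Convert an 8-bit sRGB channel to linear light (WCAG definition)."""
        c = c8 / 255
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    def relative_luminance(hex_color: str) -> float:
        """Relative luminance of a "#RRGGBB" color."""
        r, g, b = (channel_to_linear(int(hex_color[i:i + 2], 16)) for i in (1, 3, 5))
        return 0.2126 * r + 0.7152 * g + 0.0722 * b

    def contrast_ratio(fg: str, bg: str) -> float:
        """WCAG contrast ratio between two colors (lighter + 0.05) / (darker + 0.05)."""
        lighter, darker = sorted(
            (relative_luminance(fg), relative_luminance(bg)), reverse=True
        )
        return (lighter + 0.05) / (darker + 0.05)

    print(round(contrast_ratio("#828282", "#F6F6EF"), 2))  # 3.54 (usernames)
    print(round(contrast_ratio("#999999", "#F6F6EF"), 2))  # 2.62 (vote arrows)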
I run Visual Studio in dark mode, and Windows Explorer in dark mode, and Notepad++ in dark mode, and Omron Sysmac Studio in dark mode, and Autocad Electrical in dark mode, and Alibre CAD at the default (light gray) theme, and they look great. I'm in a nice, bright office, with (4) daylight 4' T8 LED bulbs directly overhead, and a whiteboard as the backdrop to my monitors, so it's not like I'm a recluse in a dark cave. The monitor brightness is not wrong.
At least, it's not wrong until I win-right or alt-tab over to an Excel document or an old version of Studio 5000, where the color scheme is stuck at black text on a white background. Then I'm instantly blinded. I can't set half the monitor (just the part over the Excel spreadsheet) to the right contrast and brightness to make Excel acceptable, because then I can't see the better-designed apps.
And don't get me started on opening up a movie or game after work's done. "Set your brightness so the logo is barely visible." Yeah, no. "Game of Thrones is a cinematic show and therefore you have to watch it like you’re at a cinema: in a darkened room." Wagner and Snyder are watching their productions on studio-grade OLEDs. I won't put a show on the integrated screen of my old Thinkpad or Precision laptops; I know those displays are trash (and they're small, so I only put stuff that's easy to see on them). But no brightness setting on my relatively nice IPS LCDs can comfortably handle the diversity of content they're used to display.
Yeah - I'm glad Android finally has the "extra dim" mode to reduce that with OLEDs (it's only barely effective with LCDs), but it comes at the cost of awful contrast. Genuinely lower panel output is much better in comparison.
I like being able to see my phone in sunlight nowadays, but there have definitely been some tradeoffs.
This is something that has always boggled my mind.
>Gee, this stark white brightness is hurting my eyes, maybe I should turn down the brightness on my display...
>Nah, I'll just demand every single company completely overhaul the CSS and contrast of everything while begrudgingly suffering the eye-burning whiteness of companies that haven't yet overhauled their CSS
If everyone designed against properly calibrated monitors it would be fine: you could just set yours to the standard calibration. But they don't, so there's no universally good brightness that you'd never need to change. It's like webpage sizes: if everyone built and tested their UIs at the standard physical screen scaling, you'd run into a lot less variance in website sizes.
If everyone had calibrated monitors and every app and website was the same brightness, yes.
Instead, it varies wildly, often for good reasons... and then switching windows exposes you to extreme shifts. E.g. switch between photo editing and a giant white text document; nothing's gonna save you then - stuff that looks correct and good for photography is absurd when most of your screen is as bright as the sun.
Calibration is one thing; perceived brightness of the whole screen with specific content is another. And it's heavily influenced by how many nits are available.
Calibration in this case isn't about providing "and so everything then looks like it's the same brightness, because it's calibrated"; it's about "and so the same modifications can be applied to everything consistently, because it's all calibrated". I.e. as things stand, just lowering everything a bit below its intended brightness isn't something you can do consistently. Something built on a monitor with its brightness curve undercalibrated and something built on one with its brightness curve overcalibrated will look two different kinds of wrong when shifted by such a transform. That would not be the case with calibrated sources: everything would shift in the same way, and you'd be able to have it look "wrong" (i.e. darker, capped, brighter, whatever) exactly the way you want, consistently.
Taking it to the DPI example: having things built at a standardised DPI isn't about making everything appear the same physical size, it's about making everything tuned against a consistent physical size, for the exact same reason. The default is always the intended size, and your global adjustments consistently make the source material larger than intended or smaller than intended, instead of "well, it depends how uncalibrated the source was whether it's still smaller or larger than intended".
> Calibration in this case isn't about providing "and so everything then looks like it's the same brightness, because it's calibrated"; it's about "and so the same modifications can be applied to everything consistently, because it's all calibrated".
You seem to be assuming the commenter you replied to didn't know that. As I understood it, all they were saying is that advice about calibration is useless for people who don't do photo editing, but whose problem is exactly that different windows have such wildly varying brightness that switching from one to another often makes everything look either pitch dark or third-degree-interrogation light in your eyes.
Which category of people do you think there are more of? My bet is on the latter. They need... Well, if you don't want to call it "another kind of calibration", you're free to come up with another term.
Does being outside generally hurt your eyes? Most complaints about the brightness of screens have to do with the contrast between the screen and the ambient lighting. When I'm in a dark room, a bright screen is going to be hard to use. In a well-lit room it's not an issue. It's kind of why ambient light sensing is a nice feature.
Screens output way more nits these days. I remember not being able to see a screen properly because a lamp was on, back when monitors only had 200-ish nits; these days you can get 1000+ nit monitors, especially HDR ones.
To play PS5 I bought a monitor instead of a TV, because the place where I had to install the screen in the rented apartment is tiny, and tiny TVs are just crap with tons of input lag. So I got a gaming LG monitor.
Many PS5 games, especially those with HDR support, offer help in adjusting the brightness, often in the form of showing a very dark and a very bright image side by side and telling you to adjust your settings until both are visible.
I found out that no matter what I do, this never happens. In the end, the best setting is when NEITHER is visible. If the dark image is visible, the screen is so bright it feels like staring into a flashlight. If the bright image is visible, the screen is so dark that I can't see the contents of the screen with my curtains open or the lights turned on. I can't wrap my head around how someone can make a screen this crap.