I'm baffled that PC gamers have decided that 1440p is the endgame for graphics. When I look at a 27-inch 1440p display, I see pixel edges everywhere. It's right at the edge of where individual pixels stop being visible (I can't perceive them on a 27-inch 2160p display), but not quite there at desktop distances.
Time marches on, and I become ever more separated from gaming PC enthusiasts.
Gaming at 2160p is just too expensive still, imo. You gotta pay more for your monitor, GPU and PSU. Then if you want side monitors that match in resolution, you're paying more for those as well.
You say PC gamers at the start of your comment and gaming PC enthusiasts at the end. These groups are not the same and I'd say the latter is largely doing ultrawide, 4k monitor or even 4k TV.
According to Steam, 56% are on 1080p, 20% on 1440p, and 4% on 2160p.
So gamers as a whole are still settled on 1080p, actually. Not everyone is rich.
The major drawback for PC gaming at 4K that I never see mentioned is how much heat the panels generate. Many of them generate so much heat that they rely on active cooling! I bought a pair of high-refresh 4K displays, and combined with the PC they raised my room to an uncomfortable temperature. I returned them for other reasons (hard to justify not returning them when I got laid off a week after purchasing them), but I've since made note of the wattage when scouting monitors.
That was earlier this year. I found a new job with a pay raise, so it turned out alright. Still miss my old team, though; we've been scattered like straws in the wind.
I'm still using a 50" 1080p (plasma!) television in my living room. It's close to 15 years old now. I've seen newer and bigger TVs many times at my friends house, but it's just not better enough that I can be bothered to upgrade.
Doesn't plasma have deep blacks and color reproduction similar to OLED? They're still very good displays, and being 15 years old means it probably pre-dates the SmartTV era.
I recently upgraded my main monitor from 1440p/144Hz to 4K/144Hz (with lots of caveats), and I agree with your assessment. If I had not made significant compromises, it would have cost at least $500 to get a decent monitor, which most people are not willing to spend.
Even with this monitor, I'm barely able to run it with my (expensive, though older) graphics card, and the screen alarmingly flashes whenever I change any settings. It's stable, but this is not a simple plug-and-play configuration (mine requires two DP cables and fiddling with the menu + NVIDIA control panel).
Why do you need two DP cables? Is there not enough bandwidth in a single one? I use a 4k@60 display, which is the maximum my cheap Anker USB-C Hub can manage.
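Back-of-the-envelope (assuming 8-bit RGB and ignoring blanking overhead, so the real requirement is a bit higher), 4K at 144Hz doesn't fit in a single DP 1.4 link without DSC, which is presumably why some monitors resort to two cables:

    # Rough check: can one DisplayPort 1.4 link carry 4K @ 144 Hz uncompressed?
    # Assumes 8 bits per channel RGB and ignores blanking, so this undercounts slightly.
    width, height, refresh_hz, bits_per_pixel = 3840, 2160, 144, 24

    required_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
    dp14_payload_gbps = 25.92  # HBR3: 32.4 Gbit/s raw, minus 8b/10b encoding overhead

    print(f"needed:    {required_gbps:.1f} Gbit/s")   # ~28.7 Gbit/s
    print(f"available: {dp14_payload_gbps} Gbit/s")   # falls short, hence DSC or two cables

At 60Hz the same math lands around 12 Gbit/s, which is why 4k@60 is no problem over one cable (or a USB-C hub).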
Reddit also seems to have some people who have gotten 144Hz working with FreeSync, but I've only managed 120Hz.
Funnily enough, while I was typing this, Netflix caused both my monitors to black-screen (some sort of NVIDIA reset, I think) and then come back. It's not totally stable!
This is likely a cable issue. Certain cable types can't handle 4K. In the past I had to switch from DisplayPort to HDMI with a properly rated cable to get around this.
It works up until too many pixels change, basically.
Had the same issue at 4K/60Hz; it mostly worked, but the screen flashed black from time to time. I used the thickest cable I had lying around and it has worked fine since.
Interesting. I've been running my 4K monitor at 240Hz with HDR enabled for months and haven't had any issues with Display Stream Compression on my 4080.
For me it's a small issue with full screen (including borderless iirc) games causing the display to black out for a few seconds.
I don't think it's an issue until you notice it. I only noticed because I toggle HDR for some games, and at 1440p/240Hz the difference is just enough to not need DSC.
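That lines up with a rough calculation, assuming a DP 1.4 link and ignoring blanking (both assumptions on my part):

    pixels_per_sec = 2560 * 1440 * 240
    print(pixels_per_sec * 24 / 1e9)   # 8-bit SDR: ~21.2 Gbit/s, fits in DP 1.4's 25.92 Gbit/s
    print(pixels_per_sec * 30 / 1e9)   # 10-bit HDR: ~26.5 Gbit/s, needs DSC

So toggling HDR really can be the thing that pushes you over the line into DSC territory.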
I don’t think that’s true anymore. I routinely find 4K/27” monitors for under $100 on Craigslist, and a 3080-equivalent is still good enough to play most games on med-high settings at 4K and ~90Hz, especially if DLSS is available.
Your hypothetical person has a 3080 but needs to crawl Craigslist for a sub-$100 monitor? I guess those people exist, but idk why you'd bother with a 3080 only to pair it with a low refresh rate, high input latency, probably TN, low color accuracy Craigslist cast-off.
Not rich. Well within reach for Americans with disposable income. Mid-range 16" MacBook Pros are in the same price ballpark as 4K gaming rigs. Or, put another way, it costs less than a vacation for two to a popular destination.
I used to be in the '4k or bust' camp, but then I realized that I needed 1.5x scaling on a 27" display to have my UI at a comfy size. That put me right back at 1440p screen real estate, with fractional scaling issues to deal with on top.
Instead, I bought a good 27" 1440p monitor, and you know what? I am not the discerning connoisseur of pixels that I thought I was. Honestly, it's fine.
I will hold out with this setup until I can get an 8K 144Hz monitor and a GPU to drive it for a reasonable price. I expect that will take another decade or so.
I have a 4K 43" TV on my desk and it is about perfect for me for desktop use without scaling. For gaming, I tend to turn it down to 1080p because I like frames and don't want to pay up.
At 4K, it's like having 4 21" 1080p monitors. Haven't maximized or minimized a window in years. The sprawl is real.
I find the scaling situation with KDE is better with the Xorg X11 server than it is with Wayland. Things like Zoom scale properly for me with the former.
It's true but I don't run into this issue often since most games and Windows will offer UI/Menu scaling without changing individual windows or the game itself.
I think it's less that gamers have decided it's the "endgame" and more that running current-gen games at good framerates at 4K requires significantly more money than 1440p does. At least to my eyes, native 1440p on a 1440p monitor looks much better than an internal resolution of 1440p upscaled to 4K, even with DLSS/FSR, so upgrading piecemeal isn't really a desirable option.
Most people don't have enough disposable income to make spending that extra amount a reasonable tradeoff (or to keep spending on GPU upgrades so that new games can keep up with their monitor).
This is a trade-off with frame rates and rendering quality. When having to choose, most gamers prefer higher frame rate and rendering quality. With 4K, that becomes very expensive, if not impossible. 4K is 2.25 times the pixels of 1440p, which means, for example, that you can get roughly double the frame rate at 1440p using the same processing power and bandwidth.
In other words, the current tech just isn’t quite there yet, or not cheap enough.
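To put the pixel-count claim in concrete terms (double is the conservative figure; pure pixel scaling would suggest closer to 2.25x, assuming the GPU is fully pixel-bound, which real games aren't):

    pixels_1440p = 2560 * 1440   # 3,686,400
    pixels_4k    = 3840 * 2160   # 8,294,400

    print(pixels_4k / pixels_1440p)   # 2.25

    # If render cost scaled purely with pixel count (a rough first-order guess),
    # the same card would manage about 2.25x the frame rate at 1440p
    # that it gets at 4K.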
Arguably 1440p is the sweet spot for gaming, but I love 4K monitors for the extra text sharpness. Fortunately, DLSS and FSR upscaling are pretty good these days. At 4K, quality-mode upscaling gives you a native render resolution of about 1440p, with image quality a little better than native 1440p and performance a little worse.
I don't think it's seen as the endgame; it's that if you want 120 fps (or 144, 165, or 240) without turning down your graphics settings, you're talking $1000+ GPUs, plus a huge case and a couple hundred watts more from your power supply.
1440p hits a popular balance where it’s more pixels than 1080p but not so absurdly expensive or power hungry.
Eventually 4K might be reasonably affordable, but we’ll settle at 1440p for a while in the meantime like we did at 1080p (which is still plenty popular too).
That's more of a function of high end Nvidia gaming card prices and power consumption. PC gaming at large isn't about chasing high end graphics anyway, steam deck falls under that umbrella and so does a vast amount of multiplayer gaming that might have other priorities such as affordability or low latency/very high fps.
It's a nice compromise for semi-competitive play. At 4K it'd be very expensive and most likely finicky to maintain high FPS.
Tbh, now that I think about it, I only really need resolution for general usage. For gaming I'm running everything but textures on low, with min or max FOV depending on the game, so it's not exactly aesthetic anyway. What I need more is physical screen size, so the heads are physically larger without shoving my face in the screen, plus refresh rate.
It's not the endgame, but most would rather have more FPS for a high refresh rate at higher graphics settings than a couple more pixels you won't notice unless you go looking for them, paired with low FPS and/or low graphics settings.
I find dual 24-inch 1440p a great compromise. Higher pixel density, a decent amount of screen real estate, and it's nice to have an auxiliary monitor when gaming.
I run the second monitor off the iGPU so it doesn't even tax the main GPU.
I don't directly see the pixels per se, like I do with 1080p at 27 inches at desktop distances. But I see harsh edges on corners, and text isn't flawless like it is at 2160p.
Like I said, it's on the cusp of invisible pixels.
Gamers often use antialias settings to smooth out harsh edges, whereas an inconsistent frame rate will literally cost you a game victory in many fast-action games. Many esports professionals use low graphics settings for this reason.
I've not tried but I've heard that a butter-smooth 90, 120, or 300 FPS frame rate (that is also synchronized with the display) is really wonderful in many such games, and once you experience that you can't go back. On less powerful systems it then requires making a tradeoff with rendering quality and resolution.