I really wish developers on consoles would just accept what kind of performance consoles really offer. There isn't a single reason a game should be 30fps these days if that game tests your motor skills in any way. If you can't hit a locked 60fps target, you need to reassess your visuals.
Hearing that 2/3 of PS users are switching to 60fps, away from a default of 30fps, is really interesting. You can have the best screenshots in the world, but if your game doesn't run well, people can tell.
IMO low-latency, rock-solid 30FPS is fine, but hard to pull off on a 3D engine. The issue is that average FPS means almost nothing when lag happens right after dramatic inputs like rapidly turning around. Worse, there's significant latency on everything from wireless controllers to the screen.
I just wish the industry had standardized on reporting 0.1%-low FPS and latency; it would have done wonders for the entire gaming industry.
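For what it's worth, the 0.1%-low idea is simple enough to sketch. The exact methodology differs between benchmarking tools, and the function name and numbers below are purely illustrative:

    # Rough sketch: deriving a "0.1% low" FPS figure from a log of frame times in ms.
    # Methodology varies between benchmarking tools; the numbers here are made up.

    def low_fps(frame_times_ms, fraction=0.001):
        """Average FPS over the slowest `fraction` of frames."""
        worst = sorted(frame_times_ms, reverse=True)    # slowest frames first
        n = max(1, int(len(worst) * fraction))          # e.g. the worst 0.1% of samples
        return 1000.0 * n / sum(worst[:n])

    frame_times = [16.7] * 9990 + [50.0] * 10           # mostly 60fps, a handful of hitches
    print(sum(frame_times) / len(frame_times))          # average frame time still looks like ~60fps
    print(low_fps(frame_times))                         # 0.1% low is ~20fps and exposes the stutter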
Some rendering engines do smear time in various ways, like motion blur. Ray tracing can pull it off more cheaply than rasterization, but you still need to compute geometry multiple times in a single frame.
Depends on the specifics, but you can burn basically unlimited processing power shooting more rays, and there are just diminishing returns. We're not there for games today, but yesterday's render farm is tomorrow's GPU.
As to why it's used: higher FPS creates artifacts on things like spinning tires. Persistence of vision means flashing something for 1/240th of a second can be surprisingly obvious. We perceive fast objects as actual blurs, not just sequences of still images.
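To make the "smearing time" point concrete, here's a minimal sketch (not from any real engine; shade_pixel and trace are made-up names) of how a ray tracer does motion blur by jittering each sample's time within the shutter interval, which is exactly the compute-geometry-multiple-times-per-frame cost mentioned above:

    import random

    # Stochastic motion blur: every sample evaluates the scene at a random instant
    # within the open-shutter interval, so fast-moving objects smear across the
    # frame instead of rendering as a crisp still image.
    def shade_pixel(trace, shutter_open, shutter_close, samples=16):
        color = 0.0
        for _ in range(samples):
            t = random.uniform(shutter_open, shutter_close)  # per-sample time jitter
            color += trace(t)                                # geometry evaluated at time t
        return color / samples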
As long as people keep buying garbage, people will keep selling garbage. Can't really blame them.
Framerate aside, a good percentage of modern TVs have abysmal latency. I recently played the original Super Monkey Ball 2 on a GameCube hooked up to a CRT and was shook; that sort of precision would never fly on modern hardware.
Game mode helps, but only somewhat; it doesn't generally solve the issue.
For this reason, as far as tight gameplay goes, my money's on devices like the Switch.
Not all that glitters is gold. I had a horrible experience with Hollow Knight on the Switch and can't understand how anyone would enjoy that game on that console. I did a video recording, and the lag playing on the console screen was easily in the 100ms range (IIRC; I can't find the recording anymore).
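For anyone who wants to reproduce that kind of measurement: the usual trick is to film the controller and the screen together and count frames between the button press and the first visible reaction. A rough sketch, with made-up numbers:

    # Estimate input-to-photon lag from a video: find the frame where the button
    # goes down and the frame where the game first reacts, then convert to ms.
    def input_lag_ms(press_frame, reaction_frame, recording_fps):
        return (reaction_frame - press_frame) / recording_fps * 1000.0

    print(input_lag_ms(press_frame=120, reaction_frame=127, recording_fps=60))  # ~117 ms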
From what I read on Reddit, there are some people who don't notice any lag (even in the aforementioned HK), while others notice it in most games. For me, even the Zelda titles have some lag (though these I play on the TV, so it's a bit unfair).
For my part, I'll stick to PC, even if it means crying a little on each upgrade. And that's coming from a guy who often used to game at 25fps as a kid, and still enjoys playing the odd couch game (with friends) via wired Steam In-Home Streaming.
If you (or others) are happy with their experience, that's fine of course ;)
The Switch (especially the pre-OLED one) does not have better input latency than a kinda modern OLED TV.
The problem is that we're compounding latencies that would be okay if it were just one. Controllers? Bluetooth instead of a cable. Monitor? OLED instead of CRT. Audio? Maybe also wireless.
All together, it creates a sluggishness you can feel but can't pinpoint.
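Back-of-the-envelope, with purely illustrative numbers (every figure below is a rough guess, not a measurement), the compounding looks something like this:

    # Illustrative only: made-up per-stage delays to show how individually
    # tolerable latencies stack into end-to-end sluggishness.
    budget_ms = {
        "bluetooth controller": 10,              # vs roughly a millisecond or two wired (rough guess)
        "game running at 30fps": 33,             # one frame of simulation/render time
        "TV processing, even in game mode": 20,  # vs effectively nothing on a CRT
        "wireless audio": 40,                    # shows up as lip-sync drift more than input lag
    }
    print(sum(budget_ms.values()), "ms, end to end")  # ~100 ms from pieces that each seem fine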
You have to have the best screenshots for the press, but most players want to play at 60fps or better. For some time now, games have been offering two graphics modes, so the press gets pretty images and players get fps.