I wanted alternatives to Nvidia for high-powered GPUs. The more I thought about it, the more it made sense to rent cloud GPUs for AI/ML workloads and use a lower-powered card for gaming. The only use cases I could come up with for wanting a high-end card are 4K gaming (a luxury I can't justify for infrequent use) or PC VR, which may still be valid if/when a decent OLED (or micro-OLED) headset is available; the Sony PSVR2 with its PC adapter is pretty close. The Bigscreen Beyond is also a milestone/benchmark.
Don't rent a GPU for gaming, unless you're doing something like a full-on game streaming service. +10ms isn't much for some games, but would be noticeable on plenty.
IMO you want those frames getting rendered as close to the monitor as possible, and you'd probably have a better time with lower fidelity graphics rendered locally. You'd also get to keep gaming during a network outage.
I don't even think network latency is the real problem; it's all the buffering needed to encode a game's output into a video stream and keep it v-synced with a network-attached display.
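For a sense of where that time goes, here's a back-of-envelope sketch of the extra latency a streamed frame can pick up compared with rendering locally. Every stage name and number below is an assumption picked for illustration, not a measurement of any particular service.

    # Back-of-envelope sketch of the extra latency a streamed frame picks up
    # compared with local rendering. Every stage and number here is an
    # assumption chosen to illustrate the point, not a measurement.
    added_stages_ms = {
        "capture + hardware encode": 5.0,
        "network one-way (nearby server)": 10.0,
        "receive/jitter buffer on the client": 8.0,
        "decode": 3.0,
        "wait for the local display's next v-sync": 8.0,
    }
    total_ms = sum(added_stages_ms.values())
    for stage, ms in added_stages_ms.items():
        print(f"{stage:42s} {ms:5.1f} ms")
    print(f"{'total extra over rendering locally':42s} {total_ms:5.1f} ms")

Even with those fairly optimistic numbers the total lands well past the +10ms mentioned above, and at 60 fps it works out to roughly two frames of added delay.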
I've tried game streaming under the best possible conditions (<1ms network latency) and it still feels a little off. Especially shooters and 2D platformers.
Yeah - there's no way to play something like Overwatch/Fortnite on a streaming service and have a good time. The only things that seem to be OK are turn-based games or platformers.
I haven't decided/pulled the trigger yet, but the Intel Arc series is giving the AMD parts a good run for the money.
The only concern is how well the new Intel drivers, which fully support DX12, work with older titles; support for DX11 and DX10 is continuously being improved, and some DX9 titles run via emulation.
There's likely some deep discounting of Intel cards because of how bad the drivers were at launch, and the prices may not stay so low once things are working much better.