Can't wait to find out what China hid in Riot's Vanguard rootkit for all their games. It's 100% a conspiracy theory, but nobody can convince me it's perfectly clean, or, if it is, that there isn't an easy way to quietly add some extra capability to it later.
China's national security assistance law came up in the TikTok hearings. There's no reason to believe the CCP lacks the legal authority to compel Riot to push an update with a backdoor to a few select high-value targets.
If it's written in C, you can always introduce a buffer overflow or something similar just by adding a little line noise here or there, and nobody can prove it was deliberate.
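To make that concrete, here's a made-up sketch (not from any real codebase): a single character of "line noise", `<=` where `<` belongs, turns an ordinary bounds check into a one-byte stack overflow.

    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    #define NAME_MAX 32

    /* Hypothetical packet parser, for illustration only. The bounds check
     * reads like routine defensive coding, but "<=" admits
     * packet_len == NAME_MAX, so the terminator lands one byte past the
     * end of the stack buffer. */
    void parse_player_name(char *out, const uint8_t *packet, size_t packet_len)
    {
        char name[NAME_MAX];

        if (packet_len <= NAME_MAX) {          /* should be '<' */
            memcpy(name, packet, packet_len);
            name[packet_len] = '\0';           /* off-by-one write when packet_len == NAME_MAX */
            memcpy(out, name, packet_len + 1);
        }
    }

In a review diff that reads as a typo, not a backdoor, which is exactly the point.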
The Vanguard drivers are signed by Microsoft, and the signing procedure includes a safety audit by Microsoft.
The driver is just what the developers say it is (as with all other anti-cheat): it provides a tamper-resistant interface the userland anti-cheat can use to pull information from the kernel, because modern cheats tend to alter the output of syscalls by running in the kernel themselves (rough sketch below).
I really don't see why anyone needs to think it's anything more than that.
If Tencent needed to spy on you that badly, a kernel anti-cheat wouldn't need to have anything to do with it...
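For anyone who hasn't seen what "an interface for the userland anti-cheat" actually looks like, here's a rough, hypothetical sketch. The device name, IOCTL code, and struct are invented and are not Vanguard's real interface; the shape is what matters: the usermode service opens the driver's device and asks it a question whose answer comes straight from kernel memory, where a usermode hook can't rewrite it.

    #include <windows.h>
    #include <winioctl.h>
    #include <stdio.h>

    /* Hypothetical IOCTL and request struct, invented for illustration. */
    #define IOCTL_AC_QUERY_HANDLES \
        CTL_CODE(FILE_DEVICE_UNKNOWN, 0x901, METHOD_BUFFERED, FILE_READ_ACCESS)

    typedef struct AC_HANDLE_QUERY {
        DWORD TargetPid;    /* process the anti-cheat is protecting        */
        DWORD OpenHandles;  /* filled in by the driver from kernel structs */
    } AC_HANDLE_QUERY;

    int main(void)
    {
        /* Device name is made up; a real anti-cheat exposes its own. */
        HANDLE dev = CreateFileW(L"\\\\.\\HypotheticalAntiCheat",
                                 GENERIC_READ | GENERIC_WRITE,
                                 0, NULL, OPEN_EXISTING, 0, NULL);
        if (dev == INVALID_HANDLE_VALUE) {
            fprintf(stderr, "driver not loaded: %lu\n", GetLastError());
            return 1;
        }

        AC_HANDLE_QUERY q;
        DWORD bytes = 0;
        q.TargetPid = GetCurrentProcessId();
        q.OpenHandles = 0;

        /* The point of asking the driver: the answer comes out of kernel
         * memory, so a cheat hooking usermode APIs can't tamper with it. */
        if (DeviceIoControl(dev, IOCTL_AC_QUERY_HANDLES,
                            &q, sizeof(q), &q, sizeof(q), &bytes, NULL)) {
            printf("handles open to pid %lu: %lu\n",
                   (unsigned long)q.TargetPid, (unsigned long)q.OpenHandles);
        }

        CloseHandle(dev);
        return 0;
    }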
It says something about Microsoft that they OK a known harmful bootkit that expects your computer to act like an Xbox with a fancy keyboard (but not too fancy), requests invasive changes to UEFI that have broken systems, and has an overall opacity that rivals an Arthur C. Clarke Monolith.
Drivers are generally not audited by Microsoft to be signed; you only need to register your EV cert to get them signed. Cheat developers have registered their own EV certificates, or gotten their hands on others', to ship kernel-driver cheats. Anti-cheats like BattlEye also download anti-cheat modules at runtime to obfuscate what they do.
MS usually doesn't bother with a driver audit... they mostly rely on the EV certificate to check that the driver dev is a proper legal entity.
If they audited properly, they would never have let the Asus AuraSync driver get certified in the first place (it basically exposes port I/O instructions to every userland app, unrestricted).
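For context on what "exposes port I/O to every userland app" means, here's a hedged sketch of the vulnerability class. The struct, IOCTL code, and names are invented, not Asus's actual code: the dispatch routine takes a port/value pair from any caller and executes the write in ring 0 with no privilege check or port allowlist.

    #include <ntddk.h>

    /* Hypothetical request layout and IOCTL code, invented for illustration. */
    typedef struct PORT_WRITE_REQUEST {
        USHORT Port;
        UCHAR  Value;
    } PORT_WRITE_REQUEST;

    #define IOCTL_PORT_WRITE \
        CTL_CODE(FILE_DEVICE_UNKNOWN, 0x800, METHOD_BUFFERED, FILE_ANY_ACCESS)

    NTSTATUS DispatchDeviceControl(PDEVICE_OBJECT DeviceObject, PIRP Irp)
    {
        PIO_STACK_LOCATION stack = IoGetCurrentIrpStackLocation(Irp);
        NTSTATUS status = STATUS_INVALID_DEVICE_REQUEST;

        UNREFERENCED_PARAMETER(DeviceObject);

        if (stack->Parameters.DeviceIoControl.IoControlCode == IOCTL_PORT_WRITE &&
            stack->Parameters.DeviceIoControl.InputBufferLength >= sizeof(PORT_WRITE_REQUEST)) {

            PORT_WRITE_REQUEST *req = (PORT_WRITE_REQUEST *)Irp->AssociatedIrp.SystemBuffer;

            /* No privilege check and no port allowlist: any process that can
             * open the device gets arbitrary hardware port writes from ring 0. */
            WRITE_PORT_UCHAR((PUCHAR)(ULONG_PTR)req->Port, req->Value);
            status = STATUS_SUCCESS;
        }

        Irp->IoStatus.Status = status;
        Irp->IoStatus.Information = 0;
        IoCompleteRequest(Irp, IO_NO_INCREMENT);
        return status;
    }

A signing process that only checks who you are, not what your dispatch routines do, will happily certify something like this.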
The level of sophistication that can go into a hack when sponsored by a nation-state is incredible. Just remember Stuxnet all the way back in '06 or whatever it was. Tech was a lot less advanced nearly two decades ago. It's not right, imo, to leave your safety up to this process.
EAC and other kernel-level anti-cheat software will dynamically load and execute signed payloads at runtime. Does Vanguard do this? If so, does Microsoft check these payloads?
If I wanted to deploy a trojan horse then the last place I would try to hide it is in an anti-cheat driver that will without any doubt be exhaustively analysed by people attempting to bypass it.
There's a ton of gamers who like to figure out how the game itself works. There's a ton of them trying to figure out how anti-cheats work, sometimes to cheat, but more often because they're curious, resourceful teenagers taking it as a challenge.
Oh, I know. That's how my career started. I made invitational in CS: Source (CAL) and then sold cheats to pay for college. My first Real Job came through a teammate.
Far more people would have accepted a RAT and been parted from their money than expressed genuine interest. Some did... not many. Most wanted the acclaim without the effort.
But there are also parties with a big interest in circumventing these protections, and they have done so for decades. The new release of RDR for PC (shamefully asking $50 for a 14-year-old game) was cracked within days of its release, if not before.
How much shit, though, and how does it compare to the risk profile of, say, not wearing a five-point seatbelt and a motorcycle helmet while driving, a bulletproof vest when going to school, or an N95 mask literally everywhere?
Security theorists are always ready to tell us about the horrifying risks of installing kernel-level code from a vendor, but can they actually quantify the expected harm (likelihood times damage) that those billions of installations have inflicted on Joe Random's life?
And contrast that with the other risks we regularly take in the name of comfort and convenience?
Funny that you initially used "Joe Ransom" as your example name (before your edit), as that describes one of the possible situations our friend Joe can end up in: malware that encrypts all his data and asks for a ransom to get it back.
I'm not really that interested in chasing this, but a point I do want to make: it isn't just risk.
If you want to participate in a lot of these multiplayer games that rank cheat prevention far too highly, you can't use a hypervisor. You must have a gaming device and a computing device. They cannot be the same.
That's fine for most, but I consider it shit. VFIO makes it possible to carve a smaller gaming machine out of one big computer. Ask me how I know.
My greater point is I don't care if I get cheated out of a finals match. I can actually speak from experience. I prefer autonomy over my devices. I kind of want to eat poop with them. A little.
What do you mean? They burned several high-value 0-days on a high-value target. Why wouldn't China burn a high-value backdoor on a target they deem valuable enough?