Why is this just now news? They already built a similar device for their Project Orion glasses. As far as I can tell, this is just the same thing but with a PC driver.
Having tried prototypes at neuroscience conferences their team attended, I can tell you the device was incredibly brittle: a damp wrist, a metal table, or even a nearby computer was enough to throw off the readings.
As the article says, the device now seems much more robust and close to market-ready, after they used ML to tune the decoding model on EMG data contributed by many participants.
Why gorilla arm? This doesn't necessarily require lifting your arm. There's an old video of Zuck doing gestures while walking, and he keeps his arm mostly at rest. Even in the worst case, how is it more tiring than holding a phone?
Gorilla arm comes from repeatedly raising your arm to point at things in front of you, even briefly. The problem is that this is such an easy-to-code, universal gesture that it creeps into every interface.
You’re correct that this was publicly announced last fall along with Orion. It's back in the news now because of the recent Nature paper showing that generic decoding models, trained across many participants, perform well on new users without any additional training data. It has nothing to do with PC drivers.
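For anyone unsure what that claim means in ML terms: a decoder trained on pooled data from many people is evaluated on a person it has never seen, with zero calibration data from them. Below is a minimal leave-one-subject-out sketch of that evaluation setup, assuming pre-extracted sEMG feature windows and scikit-learn; the data shapes, feature counts, and gesture labels are made up for illustration and are not from the paper.

```python
# Illustrative sketch only (not the paper's method): "general model works on
# new participants" = train on pooled subjects, test on a held-out subject.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Fake data: 10 "subjects", each with 200 feature windows (64 features each)
# and 4 gesture labels. Real features would come from the EMG wristband, so
# synthetic accuracy here will hover around chance.
subjects = {
    s: (rng.normal(size=(200, 64)) + s * 0.01, rng.integers(0, 4, size=200))
    for s in range(10)
}

# Leave-one-subject-out: fit on everyone else, score on the held-out person
# without using any of their data for training or calibration.
for held_out in subjects:
    X_train = np.vstack([X for s, (X, _) in subjects.items() if s != held_out])
    y_train = np.concatenate([y for s, (_, y) in subjects.items() if s != held_out])
    X_test, y_test = subjects[held_out]

    model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    model.fit(X_train, y_train)
    print(f"held-out subject {held_out}: accuracy {model.score(X_test, y_test):.2f}")
```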