The Doppler shift is actually not that extreme in LEO (a few kHz at most).
And couldn't the satellites mostly adjust for that, given that the relative Doppler shift should be pretty constant between mobiles in the same spot beam?
Obviously it's not constant over time, but if it's uniform (enough) across the footprint of a single spot beam, the satellites can adjust for the global component in both their transmitters and receivers, and the mobile devices only have to compensate for (or tolerate) their local deviation from that.
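To put rough numbers on this, here's a back-of-the-envelope Python sketch. The 550 km altitude, 1.9 GHz carrier, beam position, and ~50 km footprint are all assumed placeholder values, and the geometry is a flat-Earth approximation:

```python
import math

C = 299_792_458.0        # speed of light, m/s
MU = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371e3        # mean Earth radius, m

h = 550e3                # assumed orbital altitude, m
f = 1.9e9                # assumed carrier frequency, Hz
v = math.sqrt(MU / (R_EARTH + h))  # circular orbital speed, ~7.6 km/s

def doppler(x):
    """Doppler shift (Hz) for a user x metres ahead of the sub-satellite
    point along the ground track (flat-Earth approximation)."""
    r = math.hypot(h, x)        # slant range to the user
    range_rate = -v * x / r     # negative while the satellite approaches
    return -f * range_rate / C  # approaching user sees a positive shift

# Placeholder beam: aimed 500 km ahead of nadir, ~50 km footprint.
common = doppler(500e3)                   # shared by the whole beam
spread = doppler(525e3) - doppler(475e3)  # edge-to-edge difference

print(f"shift at beam centre: {common / 1e3:+.1f} kHz")  # ~ +32 kHz
print(f"spread across beam:   {spread / 1e3:+.1f} kHz")  # ~ +1.8 kHz
```

With these (made-up) numbers the common component is tens of kHz, but the spread within one beam (the part the handsets would still have to track after a per-beam correction) is only a couple of kHz.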
I don't know how much I'm allowed to say, but at least part of what they're doing is normal Cat-1 LTE that any modem supporting it will be able to pick up.
As far as I understand, the timing advance only matters for compensating differences in distance/latency between different devices (i.e. to keep uplink transmissions from talking over each other on the same frequency).
The satellites can correct for "global" latency themselves (unless there are higher-level parts of the LTE/E-UTRA radio protocol that can't tolerate such long latencies, e.g. ARQ timers).
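Same kind of sanity check for the latency side, reusing the placeholder geometry from above; the timing-advance constants are the standard LTE ones (steps of 16 Ts, maximum index 1282):

```python
import math

C = 299_792_458.0        # speed of light, m/s
h = 550e3                # assumed orbital altitude, m

TS = 1 / 30.72e6         # LTE basic time unit, s
TA_STEP = 16 * TS        # one timing-advance step, ~0.52 us
TA_MAX = 1282 * TA_STEP  # largest signalable TA, ~0.67 ms (~100 km cells)

def slant_range(x):
    """Slant range (m) to a user x metres from the sub-satellite point
    (flat-Earth approximation)."""
    return math.hypot(h, x)

# Same placeholder beam: centred 500 km off-nadir, ~50 km across.
rtt_common = 2 * slant_range(500e3) / C                         # shared delay
rtt_spread = 2 * (slant_range(525e3) - slant_range(475e3)) / C  # per-device

print(f"common round trip: {rtt_common * 1e3:.2f} ms "
      f"(TA can only signal {TA_MAX * 1e3:.2f} ms)")
print(f"spread in beam:    {rtt_spread * 1e6:.0f} us "
      f"(~{rtt_spread / TA_STEP:.0f} TA steps)")
```

So the ~5 ms common round trip is far beyond what the TA field can express and would have to be absorbed satellite-side, while the ~0.2 ms spread within the beam fits comfortably into normal TA signalling.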
For GSM as a half-duplex technology, there's also the matter of devices not being able to transmit and receive at the same time, but I believe the same principle applies: as long as the timing differences between devices in the same spot beam aren't too large, that's something the satellites could globally correct for.
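For scale, GSM's timing-advance budget follows directly from the standard constants (a 6-bit TA field counted in bit periods of 48/13 µs):

```python
C = 299_792_458.0            # speed of light, m/s
GSM_BIT = 48 / 13 * 1e-6     # one GSM bit period, ~3.69 us
GSM_TA_MAX = 63 * GSM_BIT    # 6-bit TA field, ~233 us of round trip

# Largest *differential* range the TA field can absorb:
print(f"{C * GSM_TA_MAX / 2 / 1e3:.0f} km")  # ~35 km
```

That's the familiar ~35 km GSM cell-radius limit; it's in the right ballpark for the spread within a single spot beam, while the hundreds of kilometres of common path would again have to be removed on the satellite side.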
The same probably applies to Doppler corrections.