
I don't think LTE can cope with the extreme doppler shift present when the other end is a satellite in LEO


The shift is actually not that extreme in LEO (a few kHz at most).

And couldn’t the satellites mostly adjust for that, given that the relative doppler shift should be pretty constant between mobiles in the same spot beam?


LTE can't even handle a high-speed train: a Doppler shift of around 1 kHz already causes significant degradation.

https://eudl.eu/pdf/10.1007/978-3-319-66628-0_41
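For scale, the shift a train induces follows from the classic Doppler formula f_d = (v/c)·f_c. A quick sketch with assumed values (a 350 km/h train and a 2.6 GHz carrier, both illustrative, not taken from the linked paper):

```python
# Back-of-the-envelope Doppler shift for a high-speed train.
# The speed and carrier frequency are illustrative assumptions.
C = 299_792_458  # speed of light, m/s

def doppler_shift_hz(radial_speed_ms: float, carrier_hz: float) -> float:
    """Non-relativistic Doppler shift: f_d = (v/c) * f_c."""
    return radial_speed_ms / C * carrier_hz

train_speed = 350 / 3.6  # 350 km/h in m/s (~97.2 m/s)
carrier = 2.6e9          # assumed LTE band around 2.6 GHz
print(f"{doppler_shift_hz(train_speed, carrier):.0f} Hz")  # on the order of 1 kHz
```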

Doppler shift is not constant for a stationary observer on Earth.


Obviously it's not constant, but if it's uniform (enough) within the footprint of a single spot beam, the satellites can adjust for the global component both in their transmitter and receiver, and the mobile devices only have to compensate for (or tolerate) their local difference from that.
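How large that local residual might be can be estimated with some assumed geometry: 550 km altitude, ~7.6 km/s orbital speed, a 2 GHz carrier, and a 25 km beam radius. These are all illustrative numbers, not anyone's actual system parameters:

```python
import math

# Residual Doppler seen by a user at the beam edge after the satellite
# pre-compensates for the beam center. All parameters are assumptions.
C = 299_792_458   # speed of light, m/s
ALT = 550e3       # satellite altitude, m
V_SAT = 7.56e3    # orbital speed, m/s
F_C = 2.0e9       # carrier frequency, Hz

def doppler_hz(along_track_offset_m: float) -> float:
    """Doppler for a ground user offset along-track from the sub-satellite
    point (flat-Earth approximation, satellite passing directly overhead)."""
    slant = math.hypot(along_track_offset_m, ALT)
    radial_speed = V_SAT * along_track_offset_m / slant
    return radial_speed / C * F_C

center = doppler_hz(0.0)   # zero at the sub-satellite point in this geometry
edge = doppler_hz(25e3)    # user at the edge of an assumed 25 km beam
print(f"residual Doppler at beam edge: {edge - center:.0f} Hz")
```

Under these assumptions the within-beam residual comes out at a few kHz, i.e. the per-device correction is far smaller than the tens of kHz of raw shift the satellite itself sweeps through over a pass.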


I don't know how much I'm allowed to say, but at least part of what they're doing is normal Cat-1 LTE that any modem supporting it will be able to pick up.


How are they accounting for the presumably very high timing advance?


As far as I understand, the timing advance only matters to compensate for differences in distance/latency between different devices (i.e. to prevent uplink transmissions from talking over each other on the same frequency).

The satellites can correct for "global" latency themselves (unless there are higher-level parts of the LTE/E-UTRA radio protocol that can't tolerate such long latencies, e.g. ARQ timers).

For GSM as a half-duplex technology, there's also the matter of devices not being able to transmit and receive at the same time, but I believe the same principle applies: as long as the timing differences between devices in the same spot beam aren't too large, that's something the satellites could globally correct for.
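A similar back-of-the-envelope check works for timing: after a global correction, what matters is the spread of propagation delays within one beam. A sketch with the same assumed geometry (550 km altitude, 25 km beam radius), compared against LTE's actual timing-advance budget (an 11-bit index in steps of 16·Ts, i.e. roughly 667 µs, or about a 100 km cell):

```python
import math

# Differential propagation delay within one spot beam vs. LTE's TA budget.
# Altitude and beam radius are illustrative assumptions.
C = 299_792_458    # speed of light, m/s
ALT = 550e3        # satellite altitude, m
BEAM_RADIUS = 25e3

def one_way_delay_s(ground_offset_m: float) -> float:
    """Propagation delay for a user offset from the sub-satellite point."""
    return math.hypot(ground_offset_m, ALT) / C

center = one_way_delay_s(0.0)
edge = one_way_delay_s(BEAM_RADIUS)
spread_us = (edge - center) * 1e6

# LTE timing advance: 11-bit index (max 1282), step 16*Ts with Ts = 1/30.72e6 s
ta_max_us = 1282 * 16 / 30.72e6 * 1e6

print(f"global one-way delay:    {center * 1e3:.2f} ms")
print(f"delay spread within beam: {spread_us:.1f} us")
print(f"max LTE timing advance:   {ta_max_us:.0f} us")
```

The ~1.8 ms global delay is far beyond what the TA field can express, but the within-beam spread of a couple of microseconds is tiny compared to the ~667 µs budget, which is exactly the "global vs. local" split described above.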

The same probably applies to Doppler corrections.



