> LoRa can reach a range of up to 3 miles (4.8 km) in urban areas, and up to 10 miles (16 km) in rural areas
In mountainous areas, LoRa on the 868MHz band reaches over 100km. Last month we had a stratospheric balloon with a Meshtastic node attached; it established a direct (albeit intermittent) connection between Warsaw and Berlin.
The hard limit is 3.5kbps with a 148dB link budget (the Medium Fast preset in Meshtastic [1]). In practice, that's a packet of around a hundred bytes every half a second or so.
Is that the cap of the whole system or can multiple users use it with individual caps of 3.5kbps? That's somehow both faster and slower than I expected.
If one node kept transmitting non-stop, it could push data out at 3.5kbps, and no one else in range could transmit at the same time.
However, in the EU there is a legal limit of a 1% duty cycle on the 868MHz band, plus a collision avoidance mechanism, meaning on average you can send a packet (up to 255 bytes) about once a minute.
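Quick sanity check on the once-a-minute figure (my own back-of-the-envelope; it ignores LoRa preamble/header overhead, so real airtime is a bit longer):

```python
# Rough duty-cycle arithmetic for the numbers above (payload only, no PHY overhead).
PAYLOAD_BYTES = 255      # max packet size mentioned above
DATA_RATE_BPS = 3500     # ~3.5 kbps, Medium Fast preset
DUTY_CYCLE = 0.01        # 1% legal limit on 868MHz in the EU

airtime_s = PAYLOAD_BYTES * 8 / DATA_RATE_BPS    # ~0.58 s per max-size packet
budget_s_per_min = 60 * DUTY_CYCLE               # 0.6 s of allowed airtime per minute
packets_per_min = budget_s_per_min / airtime_s   # ~1.03

print(f"airtime {airtime_s:.2f}s, budget {budget_s_per_min:.1f}s/min, "
      f"~{packets_per_min:.1f} max-size packets/min")
```

So a full-size packet roughly once a minute is exactly what the 1% duty cycle allows.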
Not the person you asked, but I'm in a similar boat (15 years, polyglot but a lot of C#).
I mostly use VS Code, to be honest. I use it for other languages too, and for a long time it was the only graphical editor with good remote development (over SSH) support.
Rider has that feature now though and is pretty nice too. I typically jump over to it when I need to profile something as it integrates with dotTrace. If you're coming from full-fat Visual Studio you'll probably prefer Rider.
Rider; however, that's on a Windows work machine. We are a solid way toward getting a Linux/Mac dev env going: maybe 30% is netstandard2.0, 10% is net9, and the remainder net472 (including an old-school non-SDK web app on IIS). Maybe a million LOC over its 14-year lifespan.
The context cache (or KV cache) is where intermediate attention results are stored, one entry per token of context. Its size depends on the model architecture and dimensions.
KV cache size = 2 * batch_size * context_len * num_key_value_heads * head_dim * num_layers * element_size. The "2" is for the two parts, key and value. Element size is the precision in bytes. This model uses grouped-query attention, which reduces num_key_value_heads compared to a multi-head attention (MHA) model.
With batch size 1 (for low-latency single-user inference), 32k context (recommended in the model card), fp16 precision:
2 * 1 * 32768 * 8 * 128 * 36 * 2 = 4.5GiB.
I think, anyway. It's hard to keep up with this stuff. :)
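For anyone who wants to double-check, here's the same arithmetic as a tiny Python helper (parameter names are mine; the values are the ones plugged in above):

```python
def kv_cache_bytes(batch_size, context_len, num_kv_heads,
                   head_dim, num_layers, element_size):
    """KV cache size: one key and one value tensor per layer (hence the 2)."""
    return 2 * batch_size * context_len * num_kv_heads * head_dim * num_layers * element_size

# fp16 => element_size of 2 bytes
size = kv_cache_bytes(batch_size=1, context_len=32768, num_kv_heads=8,
                      head_dim=128, num_layers=36, element_size=2)
print(f"{size / 2**30:.1f} GiB")  # 4.5 GiB
```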
I'm not sure what the FM demodulator produces when it's mistuned, but I'm guessing you'd get pretty much no output rather than white noise (since there's no frequency for the demodulator to lock onto). The problem for the user is that you wouldn't know whether your batteries are dead or you just haven't found the station. By adding a "hiss" between stations, the radio becomes more usable.
I wondered about this too, but from the linked articles it seems to be designed that way in order to make it more user friendly: when not correctly tuned to a station, it outputs the artificial white noise instead of a possibly distorted signal from a nearby frequency (or just silence if the demodulator can't lock on to anything).
It depends: if the RF frequency you use has a signal on it, then it won't be random, so it's not really noise. I wonder why they need a noise generator in a receiver chip, though... They're usually used for crypto stuff.
It's to provide "comfort noise" when the correlator indicates a missing or mistuned signal.
Muting the audio would make more sense -- and would certainly have been familiar to the CB[1] radio operators of the day in the form of a squelch effect -- but this chip was targeted at consumers who expected it to behave like a conventional FM radio.
1: An early incarnation of social media, for better and worse
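To make the squelch vs. comfort-noise distinction concrete, here's a toy sketch (entirely my own illustration; the chip does this in hardware, and the threshold and noise level here are made up):

```python
import numpy as np

def audio_out(demod_audio: np.ndarray, signal_strength: float,
              threshold: float = 0.1, comfort_noise: bool = True) -> np.ndarray:
    """Toy model: classic squelch mutes the output when the signal is weak;
    a comfort-noise design injects artificial hiss instead, so the radio
    still sounds like a conventional FM set between stations."""
    if signal_strength >= threshold:
        return demod_audio                                     # tuned in: pass audio through
    if comfort_noise:
        return np.random.normal(0.0, 0.05, demod_audio.shape)  # artificial hiss
    return np.zeros_like(demod_audio)                          # squelch: silence
```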
Haha, yes, I know CB radio. I used it for many years and I'm still a ham. It was a lot like social media, yes; never thought of it that way. Also, BBSes. And the combination thereof, which was packet radio.
But I didn't think of this because it's an analog receiver. I thought it would just receive noise in the absence of a signal like its older brethren did.
You would hear the usual white noise between stations, just by virtue of FM reception relying on high IF gain ahead of a limiting stage. But judging from the description of the block diagram, I suspect the noise you'd hear when tuning in a station would be very unfamiliar, maybe a lot of distorted tones and whistles. I can't believe they'd have gone to so much trouble to hide it otherwise.
It's a much more interesting chip than it initially appeared to be, that's for sure.
It feels like interpolated Street View imagery. There is one scene with two people between cars in a parking lot; it's the only one I have found with objects you would expect to change over time. When exploring the scene, those people sometimes disappear altogether and sometimes teleport around, just as they would when exploring Street View panoramas. You can clearly tell when you are switching between photos taken a few seconds apart.
> The team also used a CMOS camera to capture visible-light emissions from the microplasmas (...) The CMOS imagers, however, had to be placed close to the measured radiation source, reducing their applicability to remote sensing
How can it be called a long-range detector if the detector literally has to be placed at the measured object?
1. They might have used IP cores with a license that forbids disclosing any technical details, including the firmware implementation.
2. Manipulating RF registers could cause the device to operate outside of regulatory parameters, perhaps invalidating FCC certification for the whole device. By not disclosing how to use the MAC directly, they can claim they did their best to prevent the device from misbehaving.
Item 1 is certainly the case here. Espressif licensed most of their original IP from Tensilica, and this is completely normal for building a commercial microcontroller. If you don't have the in-house skill to build a complete WiFi radio stack from the ground up, you simply license the IP core and roll it into your product.
Nice work by the presenters here, it's a good idea and might help the community, but the current state is not the result of Espressif trying to be jerks.
The second item has been trotted out for roughly two decades as an excuse for why router companies block open-source firmware.
In those two decades there's been no evidence the FCC or other regulatory bodies follow that theory, and no evidence that open source firmware causes "misbehavior."
Please stop talking about it like it's an issue. It isn't.