Did you, by any chance, play Outer Wilds recently?


> LoRa can reach a range of up to 3 miles (4.8 km) in urban areas, and up to 10 miles (16 km) in rural areas

In mountainous areas, LoRa on the 868MHz band reaches over 100km. Last month we had a stratospheric balloon with a Meshtastic node attached; it established a direct (albeit intermittent) connection between Warsaw and Berlin.


I've tested LoRa hill<->hill and flat land<->flat land, and 30-60km worked with plenty of SNR left.

On 868MHz with standard 5dBi omnis.

Thesis: as long as you are using one of the more robust LoRa settings, it will always work as long as you have LoS, or at least only lightly obstructed LoS.
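
A back-of-the-envelope check (a sketch: free-space path loss only, so it assumes clear LoS and ignores antenna gains and fading margin; the 148dB link budget is the Meshtastic figure quoted downthread):

    import math

    # Free-space path loss in dB, for distance in km and frequency in MHz
    def fspl_db(distance_km, freq_mhz):
        return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

    LINK_BUDGET_DB = 148  # Medium Fast preset, per the reply downthread

    for d_km in (30, 60, 100, 520):  # 520km is roughly Warsaw<->Berlin
        loss = fspl_db(d_km, 868)
        print(f"{d_km:4d}km: FSPL {loss:5.1f}dB, margin {LINK_BUDGET_DB - loss:+5.1f}dB")

Even the ~520km balloon link squeaks through with a couple of dB to spare, which lines up with the "intermittent" connection mentioned above.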


If you have line of sight, a simple flashlight would work for communication.

Robust communication needs to work in valleys, too.


Blinking a flashlight 1200 times per second is a bit exhausting.


What's the data transmission rate with the balloon, out of curiosity?


The hard limit is 3.5kbps with a 148dB link budget (the Medium Fast preset in Meshtastic [1]). In practice, a packet of a hundred bytes or so every half a second.

[1] https://meshtastic.org/docs/overview/radio-settings/


Is that the cap for the whole system, or can multiple users each get an individual 3.5kbps cap? That's somehow both faster and slower than I expected.


If one node kept transmitting non-stop, it could push data out at 3.5kbps, and no one else in range could transmit at the same time.

However, in the EU there is a legal limit of 1% duty cycle on the 868MHz band, plus a collision avoidance mechanism, meaning on average you can send a packet (up to 255 bytes) about once a minute.
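
The arithmetic behind "once a minute" (a sketch; assumes the 3.5kbps effective rate from above and ignores LoRa preamble/header overhead):

    RATE_BPS = 3500      # Medium Fast preset, effective bitrate
    DUTY_CYCLE = 0.01    # 1% legal limit on the 868MHz band in the EU
    PACKET_BYTES = 255   # maximum packet size

    airtime_s = PACKET_BYTES * 8 / RATE_BPS   # ~0.58s on air per packet
    min_interval_s = airtime_s / DUTY_CYCLE   # ~58s between max-size packets
    print(f"airtime {airtime_s:.2f}s -> one packet every {min_interval_s:.0f}s")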


Thanks for the info, that's really interesting!


> I'm a c# dev with near 20 years experience

Which IDE do you use? JetBrains Rider?


Not the person you asked, but I'm in a similar boat (15 years, polyglot but a lot of C#).

I mostly use VS Code, to be honest. I use it for other languages too, and for a long time it was the only graphical editor with good remote development (over SSH) support.

Rider has that feature now though and is pretty nice too. I typically jump over to it when I need to profile something as it integrates with dotTrace. If you're coming from full-fat Visual Studio you'll probably prefer Rider.


Rider; however, that's on a Windows work machine. We are a solid way toward getting a Linux/Mac dev env going; maybe 30% is netstandard2.0, 10% is net9, and the remainder net472 (including an old-school non-SDK web app on IIS). Maybe around a million LOC over its 14-year lifespan.

My personal dev is shifting to Rust.


> if you run it at the full 262144 tokens of context youll need ~65gb of ram

What is the relationship between context size and RAM required? Isn't the amount of RAM needed related only to the number of parameters and quantization?


The context cache (or KV cache) is where intermediate results are stored, one entry per token. Its size depends on the model architecture and dimensions.

KV cache size = 2 * batch_size * context_len * num_key_value_heads * head_dim * num_layers * element_size. The "2" is for the two parts, key and value. Element size is the precision in bytes. This model uses grouped query attention, which reduces num_key_value_heads compared to a multi-head attention (MHA) model.

With batch size 1 (for low-latency single-user inference), 32k context (recommended in the model card), fp16 precision:

2 * 1 * 32768 * 8 * 128 * 36 * 2 = 4.5GiB.
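
Or as code (same arithmetic; the model dimensions are the ones assumed above, so swap in your model's config values):

    def kv_cache_bytes(batch, context_len, kv_heads, head_dim, layers, elem_size):
        # 2x for the separate key and value tensors
        return 2 * batch * context_len * kv_heads * head_dim * layers * elem_size

    # assumed above: 8 KV heads, head_dim 128, 36 layers, fp16 (2 bytes/element)
    size = kv_cache_bytes(batch=1, context_len=32768, kv_heads=8,
                          head_dim=128, layers=36, elem_size=2)
    print(f"{size / 2**30:.1f} GiB")  # -> 4.5 GiB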

I think, anyway. It's hard to keep up with this stuff. :)


Yes, but you can quantise the KV cache too, just like you can the weights.


A 24GB GPU can run a ~30b parameter model at 4-bit quantization with about 8k-12k of context before every GB of VRAM is occupied.


Not quite true; it depends on the number of KV heads. GLM4 32b at IQ4 quant with Q8 context can run full context with only 20GiB of VRAM.


No. Your KV cache is kept in memory also.


What's the space complexity with respect to context size? And who is trying to get it down to linear complexity?


I mean...where do you think context is stored?


The separate noise source is a bit of a surprise here. Why is it necessary? Wouldn't RF noise produce the same results?


I'm not sure what the FM demodulator produces when it's mistuned, but I'm guessing that you'd get pretty much no output, rather than white noise (since there's no frequency for the demodulator to lock onto). The problem for the user is that you wouldn't know if your batteries are dead or if you just haven't found the station. By adding a "hiss" between stations, the radio has better usability.


If the RTL-SDR is a good reference: when demodulating FM, it produces pretty much the same noise you'd expect from a mistuned old-school radio.


I wondered about this too, but from the linked articles it seems to be designed that way in order to make it more user-friendly: when not correctly tuned to a station, it outputs the artificial white noise instead of a possibly distorted signal from a nearby frequency (or just silence if the demodulator can't lock on to anything).


It depends: if the RF frequency you use has a signal on it, then it won't be random, so it's not really noise. I wonder why they need a noise generator in a receiver chip, though. They're usually used for crypto stuff.


It's to provide "comfort noise" when the correlator indicates a missing or mistuned signal.

Muting the audio would make more sense -- and would certainly have been familiar to the CB[1] radio operators of the day in the form of a squelch effect -- but this chip was targeted at consumers who expected it to behave like a conventional FM radio.

1: An early incarnation of social media, for better and worse


Haha, yes, I know CB radio. I used it for many years and I'm still a ham. It was a lot like social media, yes; never thought of it that way. Also BBSes, and the combination thereof, which was packet radio.

But I didn't think of this because it's an analog receiver. I thought it would just receive noise in the absence of a signal like its older brethren did.


You would hear the usual white noise between stations, just by virtue of FM reception relying on high IF gain ahead of a limiting stage. But judging from the description of the block diagram, I suspect the noise you'd hear when tuning in a station would be very unfamiliar, maybe a lot of distorted tones and whistles. I can't believe they'd have gone to so much trouble to hide it otherwise.

It's a much more interesting chip than it initially appeared to be, that's for sure.


In a conventional radio, yes, but I'll bet this approach would sound incredibly awful if mistuned.


Try SolidGoldMagikarp


Mostly for I/O; e.g. mmap requires the file offset to be a multiple of the page size.
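
For example, with Python's mmap wrapper (a sketch; "data.bin" and the offsets are made up, and the alignment rule is the same as for the underlying C call):

    import mmap

    PAGE = mmap.ALLOCATIONGRANULARITY  # page size (allocation granularity on Windows)

    with open("data.bin", "rb") as f:  # hypothetical file
        want = 12345                   # byte offset we actually care about
        base = (want // PAGE) * PAGE   # offset passed to mmap must be page-aligned
        m = mmap.mmap(f.fileno(), length=0, offset=base, access=mmap.ACCESS_READ)
        data = m[want - base : want - base + 16]  # index relative to mapped base
        m.close()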


It feels like interpolated Street View imagery. There is one scene with two people between cars in a parking lot; it's the only one I have found that has objects you would expect to change over time. When exploring the scene, those people sometimes disappear altogether and sometimes teleport around, as they would when exploring Street View panoramas. You can clearly tell when you are switching between photos taken a few seconds apart.


> The team also used a CMOS camera to capture visible-light emissions from the microplasmas (...) The CMOS imagers, however, had to be placed close to the measured radiation source, reducing their applicability to remote sensing

How can it be called a long-range detector if the detector literally has to be placed at the measured object?


The detector that uses scattering of the infrared light emitted by the laser is long-range.

They used a second detection method, with a CMOS camera that detected the fluorescence of the plasma produced by ionizing radiation.

That second, short-range method was used for comparison with the investigated method, to assess its efficiency.


1. They might have used IP cores with a license that forbids disclosing any technical details, including the firmware implementation.

2. Manipulating RF registers could cause the device to operate outside of regulatory parameters, perhaps invalidating FCC certification for the whole device. By not disclosing how to use the MAC directly, they can claim they did their best to prevent the device from misbehaving.


Item 1 is certainly the case here. Espressif licensed most of their original IP from Tensilica, and this is completely normal for building a commercial microcontroller. If you don't have the in-house skill to build a complete WiFi radio stack from the ground up, you simply license the IP core and roll it into your product.

Nice work by the presenters here; it's a good idea and might help the community, but the current state is not the result of Espressif trying to be jerks.


The second item has been trotted out for roughly two decades as an excuse for why router companies block open-source firmware.

In those two decades there's been no evidence the FCC or other regulatory bodies follow that theory, and no evidence that open source firmware causes "misbehavior."

Please stop talking about it like it's an issue. It isn't.


The chaos that surrounded the Flipper Zero clearly indicates that it is an issue.


There are many devices that can get uncapped (Yaesu handhelds, for example) without losing FCC certification.


"uncapped"?


To open up capabilities outside of the intended function.

