Hacker News

I look forward to the inevitable probabilistic sub-bit machine learning models :)


Sub-1-bit has been done at least as far back as 2016 for VGG-style networks (my work).

I was able to get 0.68 "effective" bits per weight.

The idea is that on each forward pass you add noise to each weight, drawn independently from a normal distribution. When you calculate the SNR, the effective precision comes out below one bit per weight. This points to the idea that a stochastic memory element can be used.

https://arxiv.org/abs/1606.01981
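A minimal numpy sketch of the idea above (not the paper's actual code; the weight matrix, the noise level, and the use of the Gaussian-channel capacity formula to turn SNR into "effective bits" are all my illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-in for a trained weight matrix (assumed, not from the
# paper): unit-variance Gaussian weights.
w = rng.normal(0.0, 1.0, size=(256, 256))
sigma_noise = 0.8  # std of the noise injected on each forward pass (assumed)

def noisy_forward(x, w, sigma, rng):
    """One forward pass: independent Gaussian noise added to every weight."""
    return x @ (w + rng.normal(0.0, sigma, size=w.shape))

# Treat each noisy weight as a Gaussian channel; Shannon capacity then gives
# an "effective" bits-per-weight figure: bits = 0.5 * log2(1 + SNR).
snr = w.var() / sigma_noise**2
effective_bits = 0.5 * np.log2(1.0 + snr)

x = rng.normal(size=(1, 256))
y = noisy_forward(x, w, sigma_noise, rng)  # a different noise draw every call
print(f"SNR ~ {snr:.2f}, effective bits ~ {effective_bits:.2f}")
```

With these assumed variances the capacity works out to roughly 0.68 bits per weight, in the same ballpark as the figure quoted above, though how the paper itself computes the number may differ.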


If noise is desirable in this way, why not just switch back to analog computing, which comes with free noise?


My 0-bit model can predict if you have cancer with 99.5% accuracy. It doesn't even need input data! Don't ask about the false negative rate though.


My 0-bit, no-input model can predict if you have cancer with 99.5% accuracy and a 0.5% false negative rate. Don't ask whether the cancer cells are benign.


My computation-free model can give you cancer with 100% certainty.


I assume this is meant to be a joke, but isn't this actually being worked on (minus the probabilistic portion)? https://arxiv.org/abs/2310.16795





