Hacker News

Could this become an attack vector somehow? The greatest minds could probably find a way to get a malicious payload decompressed into the output.


It's lossless; at worst, you'd make the compression ratio worse for certain inputs.


With LLM based compression, could we get something like the opposite of lossless, like hallucinatory? All the original content, plus more?


How this works is that the LLM predicts a probability distribution over the next token, and an arithmetic coder turns that distribution into bits. So it will never hallucinate. In the worst case, when the LLM makes an outrageous prediction, you just use more bits, but correctness is unaffected.
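The model is only a probability estimator; the arithmetic coder is what guarantees the round trip. A minimal sketch of the idea, using exact fractions and a hypothetical toy predictor standing in for the LLM (a real system would query the model's next-token distribution instead):

```python
from fractions import Fraction

# Toy stand-in for an LLM: after the first symbol, it bets half its
# probability mass on the previous symbol repeating. Hypothetical model;
# any predictor works, as long as encoder and decoder use the same one.
def model(context, vocab):
    if not context:
        return {s: Fraction(1, len(vocab)) for s in vocab}
    probs = {s: Fraction(1, 2 * len(vocab)) for s in vocab}
    probs[context[-1]] += Fraction(1, 2)
    return probs

def encode(seq, vocab):
    # Narrow [low, high) to each symbol's slice of the current interval.
    # Likely symbols get wide slices, so they cost few bits; an
    # "outrageous" prediction just shrinks the interval more.
    low, high = Fraction(0), Fraction(1)
    for i, sym in enumerate(seq):
        probs = model(seq[:i], vocab)
        cum = Fraction(0)
        for s in vocab:
            if s == sym:
                width = high - low
                low, high = low + cum * width, low + (cum + probs[s]) * width
                break
            cum += probs[s]
    return low, high  # any number in [low, high) identifies seq

def decode(x, n, vocab):
    # Replay the same model; whichever slice contains x names the symbol.
    low, high = Fraction(0), Fraction(1)
    out = []
    for _ in range(n):
        probs = model(out, vocab)
        cum = Fraction(0)
        for s in vocab:
            width = high - low
            lo, hi = low + cum * width, low + (cum + probs[s]) * width
            if lo <= x < hi:
                out.append(s)
                low, high = lo, hi
                break
            cum += probs[s]
    return out

msg = list("abracadabra")
vocab = sorted(set(msg))
low, high = encode(msg, vocab)
assert decode(low, len(msg), vocab) == msg  # exact round trip: lossless
```

A bad prediction only makes `high - low` smaller (i.e., more bits to transmit); it can never change what the decoder recovers, because the decoder walks the exact same intervals.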


Not if the compression scheme is lossless, which it is here, per my previous comment.


Presuming the software is implemented correctly, that can't happen (per the definition of "lossless"). I can imagine this happening with a careless implementation, e.g. if circumstances conspire to allow a slightly different version or configuration of the LLM to be used across compression and decompression.


LLMs are deterministic at zero temperature.


Deterministically lossy/hallucinatory.



