
No offense, but this is a good demonstration of a common mistake tech people (especially those used to common law systems like the US) make when reading laws (especially those of civil law systems like much of the rest of the world): you're thinking in technicalities, not intent.

If you use Copilot to generate code by essentially just letting it autocomplete the entire code base with little supervision, then yes, that might plausibly fall under this law.

If you use Copilot like you would use autocomplete, i.e. by letting it fill in some sections but making step-by-step decisions about whether the code reflects your intent or not, it's not functionally different from having written that code by hand as far as this law is concerned.

But of these two options, nobody actually does the first one and then just leaves it at that. Letting an LLM generate code and shipping it without a human first reasoning about and verifying it is not by itself a useful or complete process. Far more likely, it's just one part of a pipeline that uses acceptance tests to verify the code, feeds the results back into the system to generate new code, and so on. Once you include that context, it's pretty obvious that the pipeline as a whole would indeed describe an "AI system", and the fact that there's generated code involved is just a red herring.

So no, your gotcha doesn't work. You didn't find a loophole (or anti-loophole?) that brings down the entire legal system.


