Yes, that's the other outcome: the industry getting GPL'd to death for using this. But I don't think there's any established precedent on that yet.

I don't know much about AI/ML, but I think the logical outcome is that using a transformer to generate source code results in an over-trained system that reproduces sections of code verbatim, because the space of valid programs is vanishingly small relative to the space of all possible programs. It's not like art, where you get soft failure if a single pixel or word is wrong: including, excluding, or replacing a single instruction or symbol is enough to introduce fatal bugs into a program. If the system doesn't have the capacity to understand programs in general (which it probably won't if it's just a transformer), then you're going to end up with a system that spits out samples from its training data, because those are the ones that do work.
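To make the "single symbol" point concrete, here's a toy sketch (hand-written, not model output; the function is just an illustration): a standard binary search in Python where deleting one token turns a correct program into one that hangs.

    # Correct iterative binary search over a sorted list.
    def binary_search(xs, target):
        lo, hi = 0, len(xs) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if xs[mid] == target:
                return mid
            elif xs[mid] < target:
                lo = mid + 1   # drop the "+ 1" and binary_search([1], 2) loops forever
            else:
                hi = mid - 1
        return -1

A one-pixel error in an image is usually invisible; the analogous one-token error in code is an infinite loop or a wrong answer, which is the hard-failure asymmetry described above.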


