> I’m all for systemic change, but uses like this may end up having a chilling effect on human-created work.
Every time this comes up, whichever party fears for its livelihood says something like this and ignores the other side: that rigorous enforcement activity is going to have the same chilling effect on human-created work. Richard Stallman wrote a short story about this very issue.[1]
There are already people hurling abuse at artists on Twitter because they believe a work was produced with Stable Diffusion or a similar system.
> Every time this comes up, whichever party fears for its livelihood says something like this and ignores the other side: that rigorous enforcement activity is going to have the same chilling effect on human-created work.
I may be a counter-example to your argument.
At this time, I’m not advocating for anything other than self-censorship by generative AI systems (see https://news.ycombinator.com/item?id=33194623 for some initial thoughts) and, as aggregated from some of my other comments in this thread, the following:
I think it will be important to ensure that we have symmetric information going forward; otherwise, trying to put the genie back in the bottle may just further disadvantage those who try to follow the rules.
-
Society needs to change the laws regarding the preservation of the value of intellectual labor, as has long been suggested.
Acting like the law doesn’t matter is a bad thing, if we are making value judgements.
-
If society doesn’t value commodity intellectual labor, then it may need to address the commoditization of intellectual labor directly, through measures like UBI, vocational rehabilitation, etc.
Similar arguments can be made about robots and the commoditization of manual labor.
If a person published a work that clearly plagiarized another or violated a patent, that person would be open to legal action.