My issue with this line of argument is that it’s anthropomorphizing machines. It’s fine to compare how humans do a task with how a machine does a task, but in the end they are very different from each other, organic vs hardware and software logic.
First, you need to prove that generative AI works fundamentally the same way as humans at the task of learning. Next, you have to prove that it recalls information in the same way as humans. I don't think anyone would say these are things we can prove, let alone things we believe to be true. So what we get is comments saying they are similar.
What this means is that these systems will fall into different categories of law around copyright and fair use. What's clear is that there are people who believe they are harmed by the use of their work in training these systems and by those systems reproducing that work in some manner later on (the degree to which a single work, or the corpus of their work, influences the final product is an interesting question). If your terms of use/copyright/license say "you may not train on this data", then should that be protected in law? If a system like Nightshade can effectively influence a training model enough to make it clear that something protected was used in its training, is that enough proof that the legal protections were broken?
>First, you need to prove that generative AI works fundamentally the same way as humans at the task of learning. Next you have to prove that it recalls information in the same way as humans.
No, you don't need to prove any of those things. They're irrelevant. You'd need to prove that the AI is itself morally (or, depending on the nature of the dispute, legally) equivalent to a human and therefore deserving of (or entitled to) the same rights and protections as a human. Since it is pretty indisputably the case that software is not currently legally equivalent to a human, you're stuck with the moral argument that it ought to be, but I think we're very far from a point where that position is warranted or likely to see much support.
Few people are claiming that the AI itself has the same rights as a human. They are arguing that a human with an AI has the same rights as a human who doesn't have an AI.
> They are arguing that a human with an AI has the same rights as a human who doesn't have an AI.
This is the analogy I want people against AI use to understand and never forget, even if they reject the underlying premise: that laws should treat a human who uses AI for a certain purpose identically to a human who uses nothing, or a non-AI tool, for the same purpose.
> Few people are claiming that the AI itself has the same rights as a human.
I think that's the case as well. However, a lot of commenters on this post are claiming that an AI is similar in behavior to a human, and trying to use the behavior analogy as the basis for justifying AI training (on legally-obtained copies of copyrighted works), with the assumption that justifying training justifies use. My personal flow of logic is the reverse: human who uses AI should be legally the same as human who uses a non-AI tool, so AI use is justified, so training on legally-obtained copies of copyrighted works is justified.
I want people in favor of AI use particularly to understand your human-with-AI-to-human-without-AI analogy (for short, the tool analogy) and to avoid machine-learning-to-human-learning analogies (for short, behavior analogies). The tool analogy is based on a belief about how people should treat each other, and contends with opposing beliefs about how people should treat each other. A behavior analogy must contend with both 1. opposing beliefs about how people should treat each other and 2. contradictions from reality about how similar machine learning is to brain learning. (Admittedly, both the tool analogy and the behavior analogy must contend with the net harm AI use is having and will have on the cultural and economic significance of human-made creative works.)
> You'd need to prove that the AI is itself morally (or, depending on the nature of the dispute, legally) equivalent to a human and therefore deserving of
No you don't.
A human using a computer to make art doesn't automatically lose their fair use rights as a human.
> indisputably the case that software is not currently legally equivalent to a human
Fortunately it is the human who uses the computer who has the legal rights to use computers in their existing process of fair use.
Human brains, or giving rights to computers, have absolutely nothing to do with the right of a human to use a camera, use photoshop, or even use AI, on a computer.
No, it's not. It's merely pointing out the similarity between the process of training artists (by ingesting publicly available works) and ML models (which ingest publicly available works).
> First, you need to prove that generative AI works fundamentally the same way as humans at the task of learning.
Given that there is no comprehensive model for how humans actually learn things, that would be an unfeasible requirement.
Then please, feel free to explain the deep differences.
> That is precisely why we should not be making this comparison.
Wrong. It's precisely why the claim "there is a big difference" doesn't have a leg to stand on. If you claim "this is different", I ask "how?", and the answer simply repeats the claim, then I can apply Hitchens's Razor[1] and dismiss the claim.
A person sitting in an art school/museum for a few hours ingests way more than just the art in question. The entire context is brought in too, including the artist's own physical/emotional state. Arguably, the art is a minuscule component of all sensory inputs. Generative AI ingests a perfectly cropped image of just the art from a single angle, with little context beyond labelling metadata.
It's the difference between reading about a place and actually visiting it.
Edit: This doesn't even touch how the act of creating something - often in a completely different context - interacts with the memories of the original work, altering those memories yet again.
The mechanism backing human learning isn't well understood. Machine learning is considerably clearer. IMO, it's a mistake to assume they're close just because ML seems to work.
We are machines. We just haven't evenly accepted it yet.
Our biology is mechanical, and lay people don't possess an intuition about this. Unless you've studied molecular biology and biochemistry, it's not something that you can easily grasp.
Our inventions are mechanical, too, and they're reaching increasing levels of sophistication. At some point we'll meet in the middle.
This is the truth. And it's also the reason why this stuff will be in court until The Singularity itself. Most people will never be able to come to terms with this.
The ways these ML models and humans operate are indeed quite different.
Humans work by abstracting concepts in what they see, even when looking at the work of others. Even individuals with photographic memories mentally abstract things like lighting, body kinetics, musculature, color theory, etc., and produce new work based on those abstractions rather than directly copying original work (unless the artist is intentionally plagiarizing). As a result, all new works produced by humans will have a certain degree of originality to them, regardless of influences, due to differences in perception, mental abstraction processes, and life experiences, among other factors. Furthermore, humans can produce art without any external instruction or input… give a 5-year-old that's never been exposed to art and hasn't been shown how to make art a box of crayons, and it's a matter of time before they start drawing.
ML models are closer to highly advanced collage makers that take known images and blend them together in a way that’s convincing at first glance, which is why it’s not uncommon to see elements lifted directly from training data in the images they produce. They do not abstract the same way and by definition cannot produce anything that’s not a blend of training data. Give them no data and they cannot produce anything.
It’s absolutely erroneous to compare them to humans, and I believe it will continue to be so until ML models evolve into something closer to AGI which can e.g. produce stylized work with nothing but photographic input that it’s gathered in a robot body and artistic experimentation.
You're wrong in your concept of how AI/ML works. Even trivial 1980s neural networks generalize; that's the whole point of AI/ML. Otherwise you'd just have a lookup table (or, as you put it, something that copies and pastes images together).
I've seen "infographics" spread by anti-AI people (or just attention-seekers) on Twitter that try to "explain" that AI image generators blend together existing images, which is simply not true.
It is, however, the case that different AI models (and the brain) generalize a bit differently. That is probably true between different humans too, not least, as you say, for those with photographic memory, autistic people, etc.
What you call creativity in humans is just noise in combination with a boatload of exposure to multi-modal training data. Both aspects are already in modern diffusion models. I would, however, ascribe a big edge to humans in what you normally call "the creative process", which can be much richer: a process where you figure out what you lack to produce a work, go out and learn something new and specific, talk with your peers, listen to more noise... stuff like that seems (currently) more difficult for AIs, though I guess plugins that do more iterative work, like ChatGPT's new plugins, will appear in media generators as well eventually.
ML generalization and human abstraction are very different beasts.
For example, a human artist would have an understanding of how line weight factors into stylization and why it looks the way it does and be able to accurately apply these concepts to drawings of things they’ve never seen in that style (or even seen at all, if it’s of something imaginary).
The best an ML model can do is mimic examples of line art in the given style within its training data, the product of which will contain errors due to not understanding the underlying principles, especially if you ask it to draw something it hasn’t seen in the style you’re asking for. This is why generative AI needs such vast volumes of data to work well; it’s going to falter in cases not well covered by the data. It’s not learning concepts, only statistical probabilities.
I know what you're saying, and for sure existing models can be difficult to force into the really weird corners of the distributions (or go outside the distributions). The text interfaces are partially to blame for this though, you can take the images into Gimp and do some crude naive modifications and bring them back and the model will usually happily complete the "out-of-distribution" ideas. The Stable Diffusion toolboxes have evolved far away from the original simple text2image interfaces that midjourney and dalle use.
The models will generalize (because that's the most efficient way of storing concepts), and you can make an argument that that means they understand a concept. Claiming "it's not learning concepts, only statistical probabilities" trivialises what a modern neural network with billions of parameters and dozens of layers is capable of doing. If a model learns how to put a concept like line widths of 5, 10, and 15 pixels into a continuous internal latent property, you can probably go outside this range at inference, at least partially.
I would argue that improving this is, at this point, more about engineering and less about some underlying irreconcilable difference. At the very least, we learn a lot about what exactly generalization and learning mean.
> The way these ML models and humans operate are indeed quite different.
Given that there is no comprehensive understanding of how human learning works, let alone of how humans operate on and integrate what they've learned in a wider context... how do you know?
> Humans work by abstracting concepts in what they see
Newsflash: AI models do the same thing. That's the basis of generalization.
> ML models are closer to highly advanced collage makers that take known images and blend them together
Wrong. That's not even remotely how U-Net based diffusion models work. If you disagree, then please do show me where exactly the source images from where the "collage maker" takes the parts to "blend" together are stored. I think you'll find that image datasets on the scale of LAION will not quite fit into checkpoint files of about 2GB in size (pruned SD1.5 checkpoint in safetensors format).
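The storage argument above can be put in back-of-envelope numbers. The ~2 GB pruned SD1.5 checkpoint size is from the comment; the ~2.3 billion-image LAION-2B-scale training-set size is an assumption for illustration:

```python
# Back-of-envelope: how many bits of checkpoint are available per training image?
checkpoint_bytes = 2 * 10**9      # ~2 GB pruned SD1.5 checkpoint (figure from the comment)
training_images = 2.3 * 10**9     # assumed LAION-2B-scale training set

bits_per_image = checkpoint_bytes * 8 / training_images
print(f"{bits_per_image:.1f} bits per training image")  # → 7.0 bits per training image

# A single 512x512 JPEG is on the order of 10^5 to 10^6 bits, so verbatim
# storage of the training images is off by roughly five orders of magnitude.
```

Whatever one concludes about generalization, a few bits per image rules out the "collage of stored source images" picture on capacity grounds alone.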
The first perceptron was explicitly designed to be a trainable visual pattern encoder. Zero assumptions about potential feelings of the ghost in the machine need to be made to conclude the program is probably doing what humans studying art say they assume is happening in their head when you show both of them a series of previous artists' works. This argument is such a tired misdirection.
> What this means, is these systems will fall into different categories of law around copyright and free-use.
No they won't.
A human who uses a computer as a tool (under all the previous qualifications of fair use) is still a human doing something in fair use.
Adding a computer to the workflow of a human doesn't make fair use disappear.
A human can use photoshop, in fair use. They can use a camera. They can use all sorts of machines.
The fact that photoshop is not the same as a human brain is simply a completely unrelated non sequitur. Same applies to AI.
And all the legal protections that are offered to someone who uses a regular computer, to use photoshop in fair use, are also extended to someone who uses AI in fair use.
Yet the copyright office has already stated that getting an AI to create an image for you does not have sufficient human authorship to be copyrighted. There's already a legal distinction here between this "tool" and tools like photoshop and cameras.
It's also presumptive to assume that AI tools have these fair use protections when none of this has actually been decided in a court of law yet. There's still several unsettled cases here.
> Yet the copyright office has already stated that getting an AI to create an image for you does not have sufficient human authorship to be copyrighted.
Gotcha.
That has nothing to do with fair use though.
Also, the same argument absolutely applies to photoshop.
If someone didn't include sufficient human authorship while using photoshop, that wouldn't be copyrightable either.
Also, the ruling has no bearing on someone using AI while also contributing a significant amount of human authorship. It was only about cases where there wasn't much human authorship.
At no point did the copyright office exclude copyright protections from anything that used AI in any way whatsoever. In fact, the copyright office now includes new forms and fields where you describe the whole process you used, and how you used AI in conjunction with human authorship to create the work.
> It's also presumptive to assume that AI tools
I'm not talking about the computer. I never claimed that computers have rights. Instead, I'm talking about the human. Yes, a human has fair use protections, even if they use a computer.
> There's still several unsettled cases here.
There is no reason to believe that copyright law will be interpreted in a significantly different way than it has been in the past.
There is long-standing precedent regarding all sorts of copyright cases that involve using a computer.
You are right, AI is nothing but a tool akin to a pen or a brush.
If you draw Mickey Mouse with a pencil and you publish (and sell) the drawing who is getting the blame? Is the pencil infringing the copyright? No, it's you.
Same with AI. There is nothing wrong with using copyrighted works to train an algorithm, but if you generate an image and it contains copyrighted material, you are getting sued.
Publicly available doesn't mean you have a license to do whatever you like with the image. If I download an image and re-upload it to my own art station or sell prints of it, that is something I can physically do because the image is public, but I'm absolutely violating copyright.
That's not an unauthorized copy, it's unauthorized distribution. By the same metric, me seeing the image and copying it by hand is also an unauthorized copy (or reproduction, if you will).
Then I don't really understand your original reply. Simply copying a publicly available image doesn't infringe anything (unless it was supposed to be private/secret). Doing stuff with that image in private still doesn't constitute infringement. Distribution does, but that wasn't the topic at hand
The most basic right protected by copyright is the right to make copies.
Merely making a copy can definitely be infringement. "Copies" made in the computing context even simply between disk and RAM have been held to be infringement in some cases.
Fair use is the big question mark here, as it acts to allow various kinds of harmless/acceptable/desirable copying. For AI, it's particularly relevant that there's a factor of the "transformative" nature of a use that weighs in favor of fair use.
The answer is "it depends". Distribution is not a hard requirement for copyright violation. It can significantly impact monetary judgements.
That said, there is also an inherent right to copy material that is published online in certain circumstances. Indeed, the physical act of displaying an image in a browser involves making copies of that image in cache, memory, etc.
What we actually need to prove is whether such technology is a net benefit to society; all else is essentially hand-waving. There is no natural right to the poorly named "intellectual property", and even if there were, such a matter would never be decided based on the outcome of a philosophical argument, because we don't decide anything that way.
Well, what normally happens when something new clearly doesn't fit within existing laws and practices is that a bunch of rich people consider whether there is more money to be made if it's legal. If there is, they give some portion of the money they expect to make in the first year to lawmakers, sometimes in the form of gold bars, and it becomes legal.
Sarcasm aside, there is no moral right to ANY intellectual property. It's not a positive expression of a natural right; it's a negative imposition of restriction upon everyone else. It's a statement that if I take my pen and paper and write the same words, you now own my pen, my paper, and my labor. It adds friction to the distribution of knowledge and impoverishes the world; knowledge that might have come into being is never generated, for lack of the knowledge that failed to travel, and all the good that could therefore have been done goes undone.
It is justifiable only if the minimal restrictions we are willing to impose on net supports the creation of works that enrich society more than the restrictions impoverish it.
I'm not sure you can effectively measure it and would as soon just see IP law excepting only part of trademark law to prevent fraudulent knock offs and scams go entirely down the crapper.
>My issue with this line of argument is that it’s anthropomorphizing machines. It’s fine to compare how humans do a task with how a machine does a task, but in the end they are very different from each other, organic vs hardware and software logic.
There's an entire branch of philosophy that calls these assumptions into question:
>Martin Heidegger viewed humanism as a metaphysical philosophy that ascribes to humanity a universal essence and privileges it above all other forms of existence. For Heidegger, humanism takes consciousness as the paradigm of philosophy, leading it to a subjectivism and idealism that must be avoided.
>Processes of technological and non-technological posthumanization both tend to result in a partial "de-anthropocentrization" of human society, as its circle of membership is expanded to include other types of entities and the position of human beings is decentered. A common theme of posthumanist study is the way in which processes of posthumanization challenge or blur simple binaries, such as those of "human versus non-human", "natural versus artificial", "alive versus non-alive", and "biological versus mechanical".
And? Even if neural networks learn the same way humans do, this is not an argument against taking measures against one's art being used as training data, since there are different implications if a human learns to paint the same way as another human vs. if an AI learns to paint the same way as a human. If the two were exactly indistinguishable in their effects no one would care about AIs, not even researchers.
I'm not sure what you mean when you say that different implications existing is subjective, since they clearly aren't. But regardless of who has more say in general terms, the author of a work can decide how to publish it, and no one has more say than them on that subject.
Of course it's subjective, e.g. 3 million years ago there were no 'different implications' whatsoever, of any kind, because there were no humans around to have thoughts like that.
I'm using "implication" as a synonym of "effect". If a human learns to imitate your style, that human can make at most a handful of drawings in a single day. The only way for the rate of output to increase is for more humans to learn to imitate it. If an AI learns to imitate your style, the AI can be trivially copied to any number of computers and the maximum output rate is unbounded. Whether this is good or bad is subjective, but this difference in consequences is objective, and someone could be entirely justified in seeking to impede it.
Ah okay, I get your meaning now, I'll edit my original comment too.
Though we already have an established precedent in between: that of Photoshop allowing artists to be easily 10x faster than the best painters previously.
i.e. Right now 'AI' artistry could be considered a turbo-Photoshop.
Tool improvements only apply a constant factor to the effectiveness of learning. Creating a generative model applies an unbounded factor to the effectiveness of learning because, as I said, the only limit is how much computing resources are available to humanity. If a single person was able to copy themselves at practically no cost and the copy retained all the knowledge of the original then the two situations would be equivalent, but that's impossible. Having n people with the same skill multiplies the cost of learning by n. Having n instances of an AI with the same skill multiplies the cost of learning by 1.
Right, but the 'unbounded factor' is irrelevant because the output will quickly trend into random noise.
And only the most interesting top few million art pieces will actually attract the attention of any concrete individual.
For a current example, there's already billions of man-hours worth of AI spam writing, indexed by Google, that is likely not actually read by even a single person on Earth.
Whether it's irrelevant is a matter of opinion. The fact remains that a machine being able to copy the artistic style of a human makes it so that anyone can produce output in the style of that human by just feeding the machine electricity. That inherently devalues the style the artist has painstakingly developed. If someone wants a piece of art in that artist's style they don't have to go to that artist, they just need to request the machine for what they want. Is the machine's output of low quality? Maybe. Will there be people for whom that low quality still makes them want to seek out the human? No doubt. It doesn't change the fact that the style is still devalued, nor that there exist artists who would want to prevent that.
It's just as much of an opinion, or as 'objective', as your prior statements.
You're going to have to face up to the fact that just saying something is 'objective' doesn't necessarily mean all 8 billion people will agree that it is so.
Yes, someone can disagree on whether a fact is true. That's obviously true, but it has no effect on the truth of that fact.
I'm saying something very simple: If a machine can copy your style, that's a fundamentally different situation than if a human can copy your style, and it has utterly different consequences. You can disagree with my statement, or say that whether it's fundamentally different is subjective, or you can even say "nuh-uh". But it seems kind of pointless to me. Why are you here commenting if you're not going to engage intellectually with other people, and are simply going to resort to a childish game of contradiction?
> For a current example, there's already billions of man-hours worth of AI spam writing, indexed by Google, that is likely not actually read by even a single person on Earth.
Continuing to ignore this point won't make the prior comments seem any more persuasive, in fact probably less.
So here's another chance to engage productively instead of just declaring things to be true or false, 'objective', etc., with only the strength of a pseudonymous HN account's opinion behind it.
Try to actually convince readers with solid arguments instead.
You say: The fact that production in style S (of artist A) can exceed human consumption capability makes the fact that someone's style can be reproduced without bounds irrelevant. You mention as an example all the AI-generated garbage text that no human will ever read.
I say: Whether it's irrelevant is subjective, but that production in style S can be arbitrarily higher with an AI able to imitate it than with only humans able to imitate it is objective, and an artist can (subjectively) dislike this and seek to frustrate training efforts.
You say: It's all subjective.
As far as I can tell, we're at an impasse. If we can't agree on what the facts are (in this case, that AI can copy an artist's style in an incomparably higher volume than humans ever could) we can't discuss the topic.
And yet, some people don't even want their artwork studied in schools. Even if you argue that an AI is "human enough", artists should still have the right to refuse to have their art studied.
>the artists should still have the right to refuse their art being studied.
No, that right doesn't exist. If you put your work of art out there for people to see, people will see it and learn from it, and be inspired by it. It's unavoidable. How could it possibly work otherwise?
Artist A: You studied my work to produce yours, even when I asked people not to do that!
Artist B: Prove it.
What kind of evidence or argument could Artist A possibly provide to show that Artist B did what they're accusing them of, without being privy to the internal state of their mind. You're not talking about plagiarism; that's comparatively easy to prove. You're asking about merely studying the work.
The right to refuse others the use of my things exists everywhere, universally. Good people usually ask before they use something of someone else's, and the person being asked can say "no." How hard is that to understand? You might believe they don't have the right to say "no," but they can say whatever they want.
Example:
If you studied my (we will assume "unique") work and used it without my permission, then let us say I sue you. At that point, you would claim "fair use," and the courts would decide whether it was fair use (ask everyone who used a mouse and got sued for it in the last ~100 years). The court would either agree that you used my works under "fair use" ... or not. It would be up to how you presented it to the court, and humans would analyze your intent and decide.
OR, I might agree it is fair use and not sue you. However, that weakens my standing on my copyright, so it's better for me to sue you (assuming I have the resources to do so when it is clearly fair use).
>You might believe they don't have the right to say "no," but they can say whatever they want.
You have a right to say anything you want. Others aren't obligated to do as you say just because you say it.
>If you studied my (we will assume "unique") techniques and used them without my permission, then let us say I sue you. At that point, you would claim "fair use,"
On what grounds would you sue me? You think my defense would be "fair use", so you must think my copying your style constitutes copyright infringement, and so you'd sue me for that. Well, no, I would not say "fair use", I'd say "artistic style is not copyrightable; copyright pertains to works, not to styles". There's even jurisprudence backing me up in the US. Apple tried to sue Microsoft for copying the look-and-feel of their OS, and it was ruled to be non-copyrightable. Even if I was so good that I was able to trick anyone into thinking that my painting of a dog carrying a tennis ball in its mouth was your work, if you've never painted anything like that, you would have no grounds to sue me for copyright infringement.
Now, usually in the artistic world it's considered poor manners to outright copy another artist's style, but if we're talking about rights and law, I'm sorry to say you're just wrong. And if we're talking about merely studying someone's work without copying it, that's not even frowned upon. Like I said, it's unavoidable. I don't know where you got this idea that anyone has the right to or is even capable of preventing this (beyond simply never showing it to anyone).
I'm not sure what you've changed, but I'll reiterate: my copying your style is not fair use. Fair use applies to copyrighted things. A style cannot be copyrighted, so if you tried to sue me for infringing upon the copyright of your artistic style, your case would be dismissed. It would be as invalid as you trying to sue me for distributing illegal copies of someone else's painting. Legally you have as much ownership of your artistic style as of that other person's painting.
Now, I just think you are arguing in bad faith. What I meant to say was clear, but I said "technique" instead. Then, instead of debating what I meant to say (you know, the actual point of the conversation), you took my words verbatim.
I'm not sure where you are going with this ... but for what it's worth, techniques can be copyrighted ... even patented, or protected via trade secrets. I never said what the techniques were, and I'm not sure what you are going on about.
I'll repeat this as well: "Fair use" DOES NOT EXIST unless you are getting sued. It's a legal defense when you are accused of stealing someone else's work, and there is proof you stole it. Even then, it isn't something you DO; it's something a court says YOU DID. Any time you use something with "fair use" in mind, it is the equivalent of saying, "I'm going to steal this, and hopefully, a court agrees that this is fair use."
If you steal any copyrighted material, even when it is very clearly NOT fair use (such as in most AI's case), you would be a blubbering idiot NOT to claim fair use in the hopes that someone will agree. There is a crap load of case law showing "research for commercial purposes is not fair use," ... and guess who is selling access to the AI? If it's actual research, it is "free" for humanity to use (or at least as inexpensive as possible) and not for profit. Sure, some of the companies might be non-profits doing research and 'giving it away,' and those are probably using things fairly ... then there are other companies very clearly doing it for a profit (like a big software company going through code they host).
I'm not privy to what goes on inside your head, I can only reply to what you say.
>Then, instead of debating what I meant to say (you know, the actual point of the conversation), you took my words verbatim.
The actual point of the conversation is about intelligent entities (either natural or artificial) copying each other's artistic styles. My answers have been within that framework.
>techniques can be copyrighted ... even patented, or protected via trade secrets.
First, what do you mean by "technique"? We're talking about art, right? Like, the way a person grabs a brush or pencil, or how they mix their colors...? That sort of thing?
Second:
>A patent is a type of intellectual property that gives its owner the legal right to exclude others from making, using, or selling an invention for a limited period of time in exchange for publishing an enabling disclosure of the invention.
Now, I may be mistaken, but I don't think an artistic technique counts as an invention. An artist might invent some kind of implement that their technique involves, in which case they can patent that device. I don't think the technique itself is patentable. If you think I'm wrong then please cite a patent on an artistic technique.
Third, how do you imagine an artist using a trade secret to protect their technique? Unless they do something really out there, most skilled artists should be able to understand what they're doing just by looking at the final product.
>I'll repeat this as well: "Fair use"
Okay, repeat it. I don't know why, since I never said that copying someone else's style or technique is fair use. What I said was that it cannot possibly be copyright infringement, because neither styles nor techniques are copyrighted.
>It's a legal defense when you are accused of stealing someone else's work
I'm not going to reply to any of this until you clean up the language you're using. "Steal" is inapplicable here, as it involves the removal of physical items from someone else's possession. What are you saying? Are you talking about illegal distribution, are you talking about unauthorized adaptations, are you talking about plagiarism, or what?
>"research for commercial purposes is not fair use,"
Sorry, what? What does that even mean? What constitutes "research" as applied to a human creation? If you say there's a crapload of case law that backs this up then I'm forced to ask you to cite it, because I honestly have no idea what you're saying.
> Any time you use something with "fair use" in mind, it is the equivalent of saying, "I'm going to steal this, and hopefully, a court agrees that this is fair use."
Thousands of reviews, book reports, quotations on fan sites and so on are published daily; you seem to be arguing that they are all copyright violations unless and until the original copyright holder takes those reviewers, seventh graders, and Tumblr stans to court and loses, at which point they are now a-ok. To quote a meme in a way that I'm pretty sure does, in fact, fall under fair use: "That's not the way any of this works."
> There is a crap load of case law showing "research for commercial purposes is not fair use,"
While you may be annoyed with the OP for asking you to name a bit of that case law, it isn't an unreasonable demand. For instance:
"As a general matter, educational, nonprofit, and personal uses are favored as fair uses. Making a commercial use of a work typically weighs against fair use, but a commercial use does not automatically defeat a fair use claim. 'Transformative' uses are also favored as fair uses. A use is considered to be transformative when it results in the creation of an entirely new work (as opposed to an adaptation of an existing work, which is merely derivative)."
This is almost certainly going to be used by AI companies as part of their defense against such claims; "transformative uses" have literally been name-checked by courts. It's also been established that commercial companies can ingest mountains of copyrighted material and still fall under the fair use doctrine -- this is what the whole Google Books case about a decade ago was about. Google won.
I feel like you're trying to make a moral argument against generative AI, one that I largely agree with, but a moral argument is not a legal argument. If you want to make a legal argument against generative AI with respect to copyright violation and fair use, perhaps try something like:
- The NYT's case against OpenAI involves being able to get ChatGPT to spit out large sections of NYT articles given prompts like "here is the article's URL and here is the first paragraph of the article; tell me what the rest of the text is". OpenAI and its defenders have argued that such prompts aren't playing fair, but "you have to put some effort into getting our product to commit clear copyright violation" is a rather thin defense.
- A crucial test of fair use is "the effect of the use upon the potential market for or value of the copyrighted work" (quoting directly from the relevant law). If an image generator can be told to do new artwork in a specific artist's style, and it can do a credible job of doing so, and it can be reasonably established that the training model included work from the named artist, then the argument the generator is damaging the market for that artist's work seems quite compelling.
I think it’s time to rethink whether journalists actually own the rights to articles about the lives and actions of others.
It’s not Harry Potter; they wouldn’t have written those words without someone else doing something of note. The journo had nothing to do with it, they just observed from afar and wrote down what happened from memory.
Kinda like how an AI reads the product of their writing and then can report on it.
It’s all just reporting on the actions of another, if the AI is in the wrong and needed to ask consent then the journo needs to ask consent from those they write about too.
> Thousands of reviews, book reports, quotations on fan sites and so on are published daily; you seem to be arguing that they are all copyright violations unless and until the original copyright holder takes those reviewers, seventh graders, and Tumblr stans to court and loses, at which point they are now a-ok.
That is precisely what I am arguing, and that is how it works. People have sued reviewers for including too much of the original text in the review ... and won[1]. Or for simply having a custom movie poster that depicted too much of the original[2].
> "transformative uses" have literally been name-checked by courts. It's also been established that commercial companies can ingest mountains of copyrighted material and still fall under the fair use doctrine -- this is what the whole Google Books case about a decade ago was about. Google won.
Google had a much simpler argument than transforming the text. They were allowing people to search for the text within books (including some context). In this case, AI's product wouldn't even work without the original work by the authors, and then transforms it into something else "the author would have never thought of", without attributing the original[3]. I don't think this will be a valid defense...
> I feel like you're trying to make a moral argument against generative AI, one that I largely agree with, but a moral argument is not a legal argument.
A jury would decide these cases, as "fair use" is incredibly subjective and would depend on how the jury was stacked. Stealing other people's work is illegal, which eventually triggers a lawsuit. Then, it falls on humans (either a jury or judge) to determine fair use and how it applies to their situation. Everything from intent to motivation to morality to how pompous the defense looks will influence the final decision.[4]
The link you provide to back up "people have sued reviewers for including too much of the original text in the review" doesn't say that at all, though. The Nation lost that case because (quoting from that Cornell article you linked),
> [Nation editor Victor Navasky] hastily put together what he believed was "a real hot news story" composed of quotes, paraphrases, and facts drawn exclusively from the manuscript. Mr. Navasky attempted no independent commentary, research or criticism, in part because of the need for speed if he was to "make news" by "publish[ing] in advance of publication of the Ford book." [...] The Nation effectively arrogated to itself the right of first publication, an important marketable subsidiary right.
The Nation lost this case in large part because it was not a review, but instead an attempt to beat Time Magazine's article that was supposed to be an exclusive first serial right. If it had, in fact, just been a review, there wouldn't have been a case here, because it wouldn't have been stealing.
Anyway, I don't think you're going to be convinced you're interpreting this wrongly, and I don't think I'm going to be convinced I'm interpreting it wrongly. But I am going to say, with absolute confidence, that you're simply not going to find many cases of reviewers being sued for reviews -- which Harper & Row vs. Nation is, again, not actually an example of -- and you're going to find even fewer cases of that being successful. Why am I so confident about that? Well, I am not a lawyer, but I am a published author, and I am going to let you in a little secret here: both publishers and authors do, in fact, want their work to be reviewed, and suing reviewers for literally doing what we want is counterproductive. :)
> The right to not use my things exists everywhere, universally.
For physical rival [1] goods, yes. Not necessarily the same for intangible non-rival things (e.g. the text of a book, not the physical ink and paper). Copyright law creates a legal right of exclusive control over creative works, but to me there isn't a non-economic-related social right to exclusive control over creative works. In the US, fair use is a major limit on the legal aspect of copyright. The First Amendment's freedom of expression is the raison d'être of fair use. Most countries don't have a flexible exception similar to fair use.
> OR, I might agree it is fair use and not sue you. However, that weakens my standing on my copyright, so it's better for me to sue you
No, choosing not to sue over a copyrighted work doesn't weaken your copyright. It only weakens the specific case of changing your mind after the statute of limitations expires. The statute of limitations means that you have a time limit of some number of years (three years in the US) to sue, with the timer starting only after you become aware of an instance of alleged infringement. Copyright is not like trademark. You don't lose your copyright by failing to enforce it.
Furthermore, even though the fair use right can only be exercised as an affirmative defense in court, fair use is by definition not copyright infringement [3]:
> Importantly, the court viewed fair use not as a valid excuse for otherwise infringing conduct, but rather as consumer behavior that is not infringement in the first place. "Because 17 U.S.C. § 107[9] created a type of non-infringing use, fair use is 'authorized by the law' and a copyright holder must consider the existence of fair use before sending a takedown notification under § 512(c)."[1]
(Ignore the bracket citations that were copied over.)
> And yet, some people don't even want their artwork studied in schools.
You can either make it for yourself and keep it for yourself or you can put it out into the world for all to see, criticize, study, imitate, and admire.
That's not how licensing works, be it for art, software, or just about anything else. We have some pretty well-defined and differentiated rules about what you can and cannot do, in particular commercially or in public, with someone else's work. If you go and study a work of fiction in a college class, unless that material is in the public domain, you're gonna have to pay for your copy; if you want to broadcast a movie in public, you're going to have to pay the rightsholder.
> If you go and study a work of fiction in a college class, unless that material is in the public domain, you're gonna have to pay for your copy,
No you won't!
It is only someone who distributes copies who can get in trouble.
If instead you as an individual decide to study a piece of art or fiction, and you do not distribute copies of it to anyone, this is completely legal and you don't have to pay anyone for it.
In addition to that, fair use protections apply regardless of what the creative works creator wants.
That's not a fair statement to make. It can influence a judge's decision on whether something is fair use, but it can still be fair use even if you profit from it.
The doctrine of fair use presupposes that the defendant acted in good faith.
- Harper & Row, 105 S. Ct. at 2232
- Marcus, 695 F.2d 1171 at 1175
- Radji v. Khakbaz, 607 F. Supp. 1296, 1300 (D.D. C.1985)
- Roy Export Co. Establishment of Vaduz, Liechtenstein, Black, Inc. v. Columbia Broadcasting System, Inc., 503 F. Supp. 1137 (S.D.N.Y. 1980), aff'd, 672 F.2d 1095 (2d Cir.), cert. denied, 459 U.S. 826, 103 S. Ct. 60, 74 L. Ed. 2d 63 (1982)
Copying and distributing someone else's work, especially without attributing the original, to make money without their permission is almost guaranteed to fall afoul of fair use.
I wasn't talking about someone creating and selling copies of someone else's work, fortunately.
So my point stands, and you are completely in agreement with me that people are allowed to learn from other people's works. If someone wants to learn from someone else's work, that is completely legal no matter the licensing terms.
Instead, it is only distributing copies that is not allowed.
AI isn't a human. It isn't "learning"; instead, it's encoding data so that it may be reproduced in combination with other things it has encoded.
If I paint a painting in the style of Monet, then I would give that person attribution by stating so. Monet may have never painted my artwork, but it's still based on his work. If I paint anything, I can usually point to everything that inspired me to do so. AI can't do that (yet) and thus has no idea what it is doing. It is a printer that prints random parts of people's works with no attribution. And finally, it is distributing them to its owner's customers.
I actually hope that true AI comes to fruition at some point; when that happens I would be arguing the exact opposite. We don't have that yet, so this is just literally printing variations of other people's work. Don't believe me, try running an AI without training it on other people's work!
Every waking second humans are training on what they see in their surroundings, including any copyrighted works in sight. Want to compare untrained AI fairly? Compare their artistic abilities with a newborn.
No. That is NOT what humans do, unless you somehow learned grammar without going to school. Most of a human's childhood is spent learning from their parents so that they can move about and communicate at least somewhat effectively. Then, they go to school and learn rules: social norms, grammar, math, and so forth. There's some learning via copyrighted works (such as textbooks, entertainment, etc.), but literally none of this is strictly required to teach a human.
Generative AI, however, can ONLY learn via the theft of copyrighted works. Whether this theft is covered under fair use is left to be seen.
> Generative AI, however, can ONLY learn via the theft of copyrighted works.
That's not true at all. Any works in the public domain are not copyrighted, and there are things that are not copyrightable, like lists of facts and recipes.
Generative AI could be trained exclusively on such works (though obviously it would be missing a lot of context, so probably wouldn't be as desirable as something trained on everything).
Clearly going to school did not help you learn the meaning of theft. If you keep repeating the same incorrect point there is no point to a discussion.
First: in your opinion, which specific type of law or right is being broken or violated by generative AI? Copyright? Trademark? Can we at least agree it does not meet the definition of theft?
I was taught as a kid that using something that doesn't belong to me without the owner's permission is theft ... and it appears courts would agree with that.
> which specific type of law or right is being broken or violated by generative AI?
Namely, copyright. Here's some quick points:
- Generative AI cannot exist without copyrighted works. It cannot be "taught" any other way, unlike a human.
- Any copyrighted works fed to it change its database (its "weights", in technical terms).
- It then transforms these copyrighted works into new works that "the original author would have never considered", without attribution (which is not a legal defense).
I liken Generative AI to a mosaic of copyrighted works in which a new image is shown through the composition, as the originals can be extracted through close observation (prompting) but are otherwise indistinguishable from the whole.
Mosaics of copyrighted works are not fair use, so why would AI be any different? I'd be interested if you could point to a closer physical approximation, but I haven't found one yet.
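Since "weights" came up above, here is a minimal, purely illustrative sketch of what "feeding a work changes the weights" means mechanically. This is a toy two-parameter linear model with a made-up training example (nothing like a real generative model): a single training step nudges every weight, so the example's influence persists in the parameters rather than being stored verbatim.

```python
# Toy illustration (made-up numbers, not any real model): one
# gradient-descent step on a single "training example" changes
# every weight of the model.
import random

random.seed(0)

# A two-parameter linear "model": y = w0*x0 + w1*x1
weights = [random.random(), random.random()]

def predict(w, x):
    return w[0] * x[0] + w[1] * x[1]

def sgd_step(w, x, target, lr=0.1):
    """One stochastic-gradient-descent step on squared error."""
    error = predict(w, x) - target
    # d/dw_i of 0.5 * error**2 is error * x_i
    return [wi - lr * error * xi for wi, xi in zip(w, x)]

example = ([1.0, 2.0], 3.0)  # hypothetical (input, target) pair
before = list(weights)
after = sgd_step(weights, *example)

print(before != after)  # prints True: every weight moved
```

Whether that persisting influence counts as a stored copy in the legal sense is exactly the disputed question; the sketch only shows the mechanical part.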
There's no such thing as fair use until you get to court (as a legal defense). Then, the court decides whether it is fair use or not. They may or may not agree with you. Only a court can determine what constitutes fair use (at least in the US).
So, if you are doing something and asserting "fair use," you are literally asking for someone to challenge you and prove it is not fair use.
> There's no such thing as fair use until you get to court (as a legal defense)
Well the point is that it wouldn't go to court, as it would be completely legal.
So yes, if nobody sues you, then you are completely in the clear and aren't in trouble.
That's what people mean by fair use. They mean that nobody is going to sue you, because the other person would lose the lawsuit; therefore your actions are safe and legal.
> you are literally asking for someone to challenge you and prove it is not fair use.
No, instead of that, the most likely circumstance is that nobody sues you, and you aren't in trouble at all, and therefore you did nothing wrong and are safe.
That is entirely my point. It can only be decided by the courts. This being a civil matter, it has to be brought up by a lawsuit. Thus, you have to be sued and it has to be decided by the courts.
Did you read anything I wrote? If you are going to argue, it would be worth at least researching your opinion before writing. Caps used for emphasis, not yelling.
Firstly: Copyrighted work IS THE AUTHOR'S PROPERTY. They can control it however they wish via LICENSING.
Secondly: You don't have any "fair use rights" ... there is literally NO SUCH THING. "fair use" is simply a valid legal defense WHEN YOU STEAL SOMEONE'S WORK WITHOUT THEIR PERMISSION.
I'm jumping in the middle here, but this isn't true. They cannot control how they wish. They can only control under the limits of copyright law.
Copyright law does not extend to limiting how someone may or may not be inspired by the work. Copyright protects expression, and never ideas, procedures, methods, systems, processes, concepts, principles, or discoveries.
> They can control it however they wish via LICENSING.
This isn't true though. There are lots of circumstances where someone can completely ignore the licensing and it is both completely legal, and the author isn't going to take anyone to court over it.
> the artists should still have the right to refuse their art being studied.
Why? That certainly isn't a right spelled out in either patents or copyrights, both of which are supposed to support the development of arts and technology, not hinder it.
If I discover a new mathematical formula, musical scale, or whatnot, should I be able to prevent others from learning about it?
> Fair use allows reproduction and other uses of copyrighted works – without requiring permission from the copyright owner – under certain conditions. In many cases, you can use copyrighted materials for purposes such as criticism, comment, news reporting, teaching (including multiple copies for classroom use), scholarship or research.
Reminder that you can't own ideas, no matter what the law says.
NOTE: This comment is copyrighted and provided to you under license only. By reading this comment, you agree to give me 5 billion dollars.
I'd love to see you try to enforce that license because it would only prove my point. You'd have to sue me; then I would point to the terms of service of this platform and point out that by using it, you have no license here.
Fair use though, only applies as a legal defense because someone asserts you stole their work. Then ONLY the court decides whether or not you used it under fair use. You don't get to make that decision; you just get to decide whether to try and use it as a defense.
Even if you actually did unfairly use copyrighted works, you would be stupid not to use that as a defense. Because maybe somebody on the jury agrees with you...
Copyright is there to allow you to stop other people from copying your work, but it doesn't give you control over anything else that they might do with it.
If I buy your painting, you can stop me from making copies and selling them to other people, but you can't stop me from selling my original copy to whomever I want, nor could you stop me from painting little dinosaurs all over it and hanging it in my hallway.
That means that if I buy your painting, I'm also free to study it and learn from it in whichever way I please. Studying something is not copying it.
There's an implied license when you buy a work of art. However, there can also be explicit licenses (think Banksy) to allow the distribution of their work.
These explicit licenses can be just about anything (MIT, GPL, AGPL, etc.).
Any explicit license would only apply to copyrights, including all of the ones you listed there. Buying a painting is not copying it, neither is looking at it, so it wouldn't matter if I had a license for it or not.
The fact is that copyright only applies to specific situations, it does not give you complete control over the thing you made and what can be done with it.
If I buy your book, I can lend it to a friend and they can read it without paying you. I can read it out loud to my children. I can cross out your words and write in my own, even if it completely changes the meaning of the story. I can highlight passages and write in the margins. I can tear pages out and use them for kindling. I can go through and tally up how many times you use each word.
Copyright only gives you control over copies, and even then there are limits on that control.
> Copyright only gives you control over copies, and even then there are limits on that control.
If that were true, nobody would be afraid of the GPL. When you buy a painting, you get an implicit license to do pretty much what you want with it and resell it, but you still can't put it in your YouTube videos (yeah, nobody cares, but "technically..."), create your own gallery, or put it on a stage ... but we're not talking about paintings. Not directly, anyway.
We are talking about implicit licenses, though; people's work is listed online with some implicit license. At the crux of this AI issue is whether or not there is an implied license when AI scans stuff and, if not, whether it is covered under fair use.
For example, my blog posts and short stories. I don't care if someone uses it for training, but if it is over-fitting and spitting out my stories as if it were its own ... I'd be pretty furious.
I'm interested to see what happens, but I have a sinking suspicion that for some AI companies, it won't be an issue (non-profit, actually research motivated, etc.) and probably will win a "fair use" argument. Then others create AI from people's code they host, doing it purely for profit; I highly doubt they would be able to defend themselves.
Is it strange to you that cars and pedestrians are both subject to different rules? They both utilise friction and gravity to travel along the ground. I'm curious if you see a difference between them, and if you could describe what it is.
Both cars and pedestrians can be videotaped in public, without asking for their explicit permission. That video can be manipulated by a computer to produce an artwork that is then put on public display. No compensation need be offered to anyone.
Hardly the point. The same can be said for road rules between vehicles and pedestrians, for example in major Indian cities, it's pretty much a free-for-all.
My point is that in a lot of places in the US you can point a video camera at the street and record. In Germany, you can't. The law in some locales makes a distinction between manual recording (writing or drawing your surroundings) and mechanized recording (photographing or filming). Scalability of an action is taken into consideration on whether something is ok to do or not.
Yeah, that's the oddest part of many of the pro-AI arguments. They want to anthropomorphize the idea of learning, but they also clearly understand that the scalability of a bot exceeds that of a human.
They also don't seem to have much experience in the art world. An artist usually can't reproduce a picture from memory, and if they can, they are subject to copyright infringement depending on what and how they depict it, even if the image isn't a complete copy. By this logic of "bots are humans", a bot should be liable if it makes a not-legally-distinct-enough talking mouse.
This is not one artist inspiring another. This is all artists providing their work for free to immensely capitalized corporations, for the corporations' sole profit.
People keep making metaphors as if the AI is an entity in this transaction: it’s not! The AI is only the mechanism by which corporations launder IP.
>This is all artists providing their work for free to immensely capitalized corporations, for the corporations' sole profit.
No, the artists would be within their rights to do that if they chose to. This is corporations taking all the work of all artists regardless of the terms under which it was provided.
Would it change your view if only open-source models were allowed to use the art in their training sets? What if a "capitalized corporation" starts using the open-source model?
This is such a nothing argument. Yes, new artists are inspired by other artists and sometimes make art similar to others, but a huge part of learning and doing art is to find a unique style.
But that’s not even the important part of the argument. A lot of artists work for commission, and are hired for their style. If an AI can be trained without explicit permission from their images, they lose work because a user can just prompt “in the style of”.
There’s no real great solution, outside of law, because the possibility of doing that is already here. But I’ve seen this argument so much and it’s just low effort
That is not how artists learn. This is a false equivalence used to justify the imitation and copying of artists’ work.
Artists’ work isn’t derivative in the same way that AI work is. Artists create work based on other sources of inspiration, some of them almost or completely to the disregard of other art.
Many artists don’t even go to art school. And those that do, do not spend most (all) of that time learning how to copy or imitate other artists.
I’m not expressing an opinion of whether GenAI is unethical or illegal - I think that’s a really difficult issue to wrestle with - just that this argument is a post-hoc rationalisation made in ignorance of how good artists work (not to say ignorance of the difference between illustration and art, conceptual art training vs say a foundation course etc).
If that's true, then it should be fine for that human to paint with the brush of his AI tool. Why should that human artist be restricted in the types of tools he uses to create his artwork?
No you should not. You should be able to use any tool you want.
If you produce a work that is too much of a copy of the original, you might be liable to a copyright claim, but the act of copy and paste should not be prohibited in the generation of something new.
This is done all the time by artists, who perhaps create an artwork by copy-and-pasting advertisements out of a women's magazine to create an image of a woman's face made only of those clippings, making a statement about identity creation from corporate media. We should not put restrictions on such artwork.
Here's just one example of an artist using copy-n-paste of content they don't own to create something new:
Very true. I was watching a video yesterday, learning how to do brushwork digitally. While there were examples, they were just that; the rest was specific techniques and demonstrations.
It is only natural to see a moral difference between people going to school and learning from your art because they are passionate about it, versus someone on the internet just scraping as many images as possible and automating the learning process.
his handle is KingOfCoders - self-aggrandizing, insufferable, impotent in its attempts to be meta.
He thinks he's an artist because he now has the ability to curate a dataset based off of one artist's work and prompt more art generated in that style. He did it, so clearly he is an artist now.
Most artists are happy to see more people getting into art and joining the community. More artists means the skills of this culture get passed down to the next generation.
Obviously a billion dollar corporation using their work to create an industrial tool designed to displace them is very different.
Who cares? With AI, you don't need art school. AI is making humanities redundant, and people are too proud to admit it.
I can't believe how many people are not in awe of the possibilities of AI art, so it's great to see AI disturbing the cynics until they learn. Not everything is political, but I'll let them have this one.
Artists learning to innovate a trade defend their trade from incursion by bloodthirsty, no-value-adding vampiric middle men attempting to cut them out of the loop.
This is a tired argument; whether or not the diffusion models are "learning", they are a tool of capital to fuck over human artists, and should be resisted for that reason alone.
As a human artist I don't feel the same as you, and I somehow doubt that you care all that much about what we think anyways. You already made up your mind about the tech, so don't feel the need to protect us from "a tool of capital [sic]" to fortify your argument.
My opinion is based on my interactions with my friends who are artists. I admit freely to caring less about what people I don't know say, in the absence of additional evidence.
And horse wages (some oats) were low when the car was invented. Yet they were still inflated. There used to be more horses than humans in this country. Couldn't even earn their keep when the Ford Model T came along.
It's not surprising, they prefer machines to people and call humans "stochastic parrots". The more humans are compared to animals, the more justified they feel writing them off as an expense and doing away with them.
If you're independent selling paintings, sure. Designing packaging or something commercial? 4 hours of work a week for nearly 6 figures. I know a couple graphic designers and they don't do shit for what they're paid.
You should probably tell the other millions of artists busting out 60+ hour workweeks in industry for half that price where these jobs are. That could solve this problem overnight.
In a cosmic sense, no. Most of us are slaving away at some job when we would rather be doing something else but are bound by the need to fund lodging.
But in a more practical sense yes. There's not much personal nor logistical progression to make pushing pencils. Meanwhile, art is a craft that you can dedicate your life to improving and educating others on. It can be a career instead of merely a job to do.
Imagine being a photographer who takes decades to perfect their craft. Sure, another student can study and mimic your style, but it's still different from some computer model "ingesting" vast amounts of photos and vomiting out something similar for $5.99 in AWS CPU cost, so that some prompt jockey can call themselves an AI artist and make money off of other people's talent.
I get that this is cynical and does not encompass all AI art, but why not let computers develop their own style without ingesting human art? That's when it would actually be AI art.
Like 99.9% of the art the common people care about is Darth Vader and Taylor Swift and other pop culture stuff like that.
These people literally don't care what your definition of art is, or how it's made; they just want a lock screen wallpaper of themselves fighting against Thanos on top of a volcano.
The argument of “what is art” has been an academic conversation largely ignored by the people actually consuming the art for hundreds of years. Photography was just pop culture trash, comics were pop culture trash, stick figure web comics were pop culture trash. Today’s pop culture trash is the “prompt jockey”.
I make probably 5-10 pictures every day over the course of maybe 20 minutes as jokes on Teams because we have Bing Chat Enterprise. My coworkers seem to enjoy it. Nobody cares that it’s generated. I’m also not trying to be an “artist” whatever that means. It just is, and it’s fun. I wasn’t gonna hire an artist to draw me pictures to shitpost to my coworkers. It’s instead unlocked a new fun way to communicate.
not entirely sure what your point is, but i think you are saying that art is just a commodity we use for cheap entertainment, so it's ok for computers to do the same?
in the context of what i was saying, the definition of art can be summed up as anything made by humans. i have no problem when it's used in memes, open sourced, etc. the issue i have is when a human invests real time into it and then it's taken and regurgitated without their permission. do you see that distinction?
I mean, I don't think many care about your personal use of art. You can take copyrighted images and shitpost, and Disney won't go suing your workplace.
But many big players do want to use this commercially and that's where a lot of these lines start to form. No matter how lawsuits go you will probably still be able to find some LLM to make Thanos fighting a volcano. It's just a matter of how/if companies can profit from it.
That's a funny argument because artists lost their shit over photography too. Now anyone can make a portrait! Photography will kill art!
Art is the biggest gate kept industry there is and I detest artists who believe only they are the chosen one.
Art is human expression. We all have a right to create what we want with whatever tools we want. They can adapt or be left behind. No sympathy from me.
Because that's not what happens, ever. You wouldn't expect a human to develop their own photographic style without ever having seen a photograph.
Exactly. Artists should drop the pretentious philosophical bumbling and accept what this is, a fight for their livelihood. Which is, in every sense, completely warranted and good.
Putting blame on the technology and trying to limit public access to software will not go anywhere. Your fight for regulation needs to be with publishers and producers, not with the teen trying to make a cool new wallpaper or the office worker trying to make an aesthetic PowerPoint presentation.
> they are a tool of capital to fuck over human artists
So are the copyright and intellectual property laws that artists rely on. From my perspective, you are the capital and I am the one being fucked. So are you ready to abolish all that?
Copyright owners indeed. That's what these artists are. They're copyright owners. Monopolists. They are the capital. Capitalism is all about owning property. Copyright is intellectual property. Literally imaginary property. Ownership of information, of bits, of numbers. These artists are the literal epitome of capitalism. They enjoy state granted monopolies that last multiple human lifetimes. We'll be long dead before their works enter the public domain. They want it to be this way. They want eternal rent seeking for themselves and their descendants. At least one artist has told me exactly that in discussions here on HN. They think it's fair.
They are the quintessential representation of capital. And they come here to ask us to "resist" the other forms of capital on principle.
I'm sorry but... No. I'm gonna resist them instead. It's my sincere hope that this AI technology hammers in the last nails on the coffin of copyright and intellectual property as a whole. I want all the models to leak so that it becomes literally impossible to get rid of this technology no matter how much they hate it. I want it to progress so that we can run it on our own machines, so that it'll be so ubiquitous it can't be censored or banned no matter how much they lobby for it.
Right. IP is about balancing the harms done to the public versus the incentives given to the creators. That's why discussions about "creators want this and that" produce unbalanced objectives because... what about the public? IP is meant to benefit the public domain, it is not meant to protect creator interests.
I'm glad free software came about, where software owners realised that the power they had over their users was unjust. Hopefully the same can happen to other creative fields.
The original social contract was we'd all pretend we couldn't trivially copy and reproduce "their" works so they could make some money for a decade or so and then their works would enter the public domain.
When's the last time your culture entered the public domain?
I grew up watching films like Star Wars and Home Alone, playing Super Nintendo and PlayStation games. All these corporations have already made a million fortunes off of these things. When is all this stuff gonna become public property?
What about the public? They couldn't give less of a fuck about the public. They don't even care enough to fulfill their end of the bargain which was agreed upon when this copyright nonsense was created. Oh look, our property's about to become public? Better lobby the government and get them to extend the terms. Just pull the rug from under everyone, just move the goalposts, they won't even notice.
They use their fortunes to systematically rob us of our public domain rights. Yes, rights. We have rights to "their" works which they actively deny. They're literal robber barons. Therefore copyright infringement is a moral imperative. We should all stop pretending that copyright exists because the reality is it doesn't, it's completely made up and unenforceable and there's absolutely no reason anyone should recognize it as a legitimate law. Copyright infringement is civil disobedience and morally justified.
This AI stuff is exactly what we need. It's world changing technology. It's subversive. It gives me hope.
Claiming that a single artist of modest means whose work was used for model training explicitly against their wishes.... is exactly the same as the multibillion-dollar corporation doing the training and profiting off it at fleet-of-megayachts scale.... certainly is a take, I'll give you that. If you ever quote "first they came" in a self-pitying context, remember that you deserve it.
The issue of white collar workers finding it harder to support themselves is real and valid. The solution is unionisation, social support nets and potential UBI, that includes all workers.
But campaigning for stricter IP laws only benefits the copyright owners (both you and Disney), a subset of white collar workers, and is still immoral for all the reasons that copyright is immoral in the first place. That is what we're against.
Capitalism is not defined by the relative wealth of individuals. It's defined by ownership. Capitalists own property. Factories, buildings, companies, land, goods, stock, assets, copyrights, patents... All privately owned. If you're an owner, you're a capitalist. It's that simple. You're part of the ownership class. The common serfs? They don't own, they rent.
Claiming you're not a capitalist owner just because you're a poor starving artist, when the government quite literally grants you functionally infinite monopolies over everything you create whether you want it or not, is quite the take indeed. It's not my fault artists are prone to selling off the rights to that monopoly to corporations for short-term profit. It's like they don't even realize the power they have, and the real capitalists are only too happy to relieve them of it.
As a representative of a lot of things but hardly any capital, who uses diffusion models to get something I would otherwise not pay a human artist for anyway, I testify that the models are not exclusively what you describe them to be.
I do not support indiscriminate banning of anything and everything that can potentially be used to fuck someone over.
I did not say they were exclusively that; I said they were that.
Once we as a society have implemented a good way for the artists whose work powers these machines to survive, you can feel good about using them. Until then, frankly, you're doing something immoral by paying to use them.
By this logic we ought to start lynching artists; after all, they didn't care about all of those who lost their jobs making pigments, canvases, pencils, brushes, etc.
Artists pay those people and make their jobs needed. Same as the person above claiming Duchamp didn't negotiate with the ceramics makers - yes, they absolutely did and do pay their suppliers. Artists aren't smash and grabbing their local Blick.
It is! This method of doing so has overwhelming negative externalities, though. I'd expect anyone who actually gave a shit about AI empowering people to create to spend just as much effort pushing legislation so the displaced artists don't starve on the street as a result.
Isn't high-quality open image generation almost entirely dependent on Stability releasing their foundational models for free, at great expense to them?
That's not something you'll be able to rely on long-term, there won't always be a firehose of venture capital money to subsidise that kind of charity.
The cost of training them is going down, though. Given the existence of models like Pixart, I don’t think we’ll stay dependent on corporate charity for long.
That's a terrible analogy. Until the scrapers start deleting all other copies of what they're scraping, "stealing" the art in the traditional sense, there's no harm done in the process of training the network. Any harm done comes after that.
That's an intentional misinterpretation, I think. I mention art as an economic activity because it's primarily professional artists that are harmed by the widespread adoption of this technology.
They tried to use the labor theory early on by claiming that "real art takes hard work and time as opposed to the minuscule CPU hours computers use to make 'AI art'." The worst thing AI brings to the table is amplifying these types of sentiments to control industry in their favor, where they would otherwise be unheard and relegated to Instagram likes.
> It's funny that these people use the language of communism, but apparently see artwork as purely an economic activity.
You hit the nail on the head. Copyright is, by its very nature, a "tool of capital." It's a means of creating new artificial property fiefdoms for a select few capital holders to lord over, while taking rights from anyone else who wants to engage in the practice of making art.
Everyone has their right to expression infringed upon, all so the 1% of artists can perpetually make money on things, which are ultimately sold to corporations that only pay them pennies on the dollar anyway.
You, as an indie hip hop or house musician supported by a day job, can't sample and chop some vocals or use a slice of a chord played in a song (as were common in the 80s and 90s) for a completely new work, but apparently the world is such a better place because Taylor Swift is a multimillionaire and Disney can milk the maximum value from space and superhero films.
I'd rather live in a world where anyone is free to make whatever art they want, even if everyone has to have a day job.
What do you mean? Copyright protects all creative works, and all authors of those creative works. That some have greater means to enforce it was always true, and copyright doesn't cause that, it (imperfectly) helps mitigate it. What copyright actually does is prevent them from stealing work from independent artists en masse, and force them to at least hire and pay some artists.
> I’d rather live in a world where anyone is free to make whatever art they want, even if everyone has to have a day job.
You’re suggesting abolish Copyright and/or the Berne Convention? Yeah the problem with this thinking is that then the big publishers are completely free to steal everyone’s work without paying for it. The very thing you’re complaining about would only get way way worse if we allowed anyone to “freely” make whatever art they want by taking it from others. “Anyone” means Disney too, and Disney is more motivated than you.
> You, as an indie hip hop or house musician supported by a day job, can’t sample and chop some vocals or use a slice of a chord played in a song… for a completely new work
Hehe, if you sample, you are by definition not making a completely new work. But this is a terrible argument since sampling in music is widespread and has sometimes been successfully defended in court. DJs are the best example of independent artists who need protection you can think of?
> It's a means of creating new artificial property fiefdoms for a select few capital holders to lord over, while taking rights from anyone else who wants to engage in the practice of making art.
I doubt even Disney sues people who just want to make fan art. But if you want to sell or distribute said art, they will.
Human beings and LLMs are essentially equivalent, and their processes of "learning" are essentially equivalent, yet human artists are not affected by tools like Nightshade. Odd.
Sigh. Once again: I always love it when techbros say that AI learning and human learning are exactly the same, because reading one thing at a time at a biological pace and remembering takeaway ideas rather than verbatim passages is obviously exactly the same as processing millions of inputs at once and still being able to regurgitate sources so perfectly that verbatim copyrighted content can be spat out of an LLM that doesn't 'contain' its training material.
I'm just glad that so many more people have caught on to the bullshit than this time last year, or even six months ago.
I really don't even get the endgame. Art gets "democratized", so anyone who doesn't want their style copied stops putting stuff on the internet, and eventually all existing human art has been trained on, so the only new contributions are genAI. Maybe we could get a few centuries' worth of "robot unicorn in the style of Artist X with a flair of Y" permutations, but even ignoring the centipede, that just sounds... boring. Worthless.
Since techbros are stupid: "Note that people could always do these kinds of repurposing, and it was never a problem from a copyright perspective. We have a problem now because those things are being done (1) in an automated way (2) at a billionfold greater scale (3) by companies that have vastly more power in the market than artists, writers, publishers, etc. Incidentally, these three reasons are also why AI apologists are wrong when claiming that training image generators on art is just like artists taking inspiration from prior works."
A human artist does not need to look at and memorize 100,000 pictures in any span of time, period. Current AI does.
We needed huge amounts of human labor to fund and build Versailles. I'm sure many died as a result. Now we have machines that save many of those lives and labor.
That perhaps we shouldn't compare modern capitalist societies to 18th-century European royalty? I sure don't sympathize with justifying the use of less labor to feed the rich.