> As an example, when people use ChatGPT and get an image back, most don't think "oh, so the LLM called out to a diffusion API?" - they just think "oh, ChatGPT can give me an image if I give it a prompt".
Note: your first part entirely skipped the process of obtaining the data for, and training, both of the above, which is a crucial part, at least on par with which component called which API.
I don’t think it’s unreasonable to expect people to build an intuition for it, though. It’s healthy when underlying processes are understood down to at least a few layers of abstraction, especially in potentially problematic or morally grey areas.
As an analogy to your example, you could say that when people drink milk they usually don’t think “oh, so this cow was forced to reproduce 123 times, with all her children taken away and murdered, so that she makes more milk” and instead simply think “the cow gave this milk”.
However, with milk as with ML tech, it is important to realize that 1) people do indeed learn these facts and build the relevant intuitions, and 2) the industry relies on information asymmetry and mass ignorance of these matters (and we all know that information asymmetry is the #1 enemy of a free market working as designed).
I disagree; people are not mindless consumers. People are interested in where things come from, and given the knowledge, most of them would make the ethically good choice when they can afford it.
What breaks this (and prevents the free market from working as intended) is the lack of said knowledge, i.e., information asymmetry. In the case of the milk example, I think it mostly comes down to two factors:
1) Lack of this awareness is financially beneficial to the respective industries. (No need for conspiracy theories, but they are sure as hell not going to run educational campaigns on these topics, or to facilitate any such efforts in any way, including by refraining from suing them. This is further complicated by the fact that many of these industries are integral parts of many local economies, which makes it in the interest of the respective governments to follow suit.)
2) The facts can be so harsh that it is difficult to internalise them and accept reality.
Even still, many people do learn and internalise this (ever noticed the popularity of oat and almond milk in coffee shops, despite the higher prices?), so I think it is not unreasonable to expect the same in certain ML-based industries.
This seems to be veering off into a specific ethical viewpoint about milk supply chains and production, rather than serving as an analogy for product vs. process.
But personally, I can't see how the popularity of oat and almond milk in independent coffee shops tells us much about how people perceive the inner workings of ChatGPT.
There is a difference between descriptive and prescriptive statements. I don’t disagree that the status quo is that many people may not care, but I believe it is reasonable to expect them to care, just as they do in other domains (e.g., the milk analogy). The information asymmetry can (and perhaps should) be fought against.