I’ve seen this machine/human analogy made before, and I don’t understand why it goes through. I think people over-index on the fact that the “learning” terminology is so common. If the field of ML were instead “automatic encoding” I don’t think it would change the IP issues.
I think the argument fails for two reasons:
1. I assume we are operating in some kind of intellectual property framework. Otherwise, what's the issue? Artists don't have a free-standing right to high demand for their work. The argument has to be that they have ownership rights which were violated. But in that case, the human/machine distinction makes complete sense. If you own a work, you can give permission for certain people/uses but not others (like only giving permission to people who pay you to use the work). Thus, artists may argue: however it was we made our works available, it was clear/reasonable that we were making it available for people but not for use in training AI systems. If developers had a license to use the works for training, then of course there would be no issue.
2. We could reverse the analogy. Let's say I go watch a play. The performers have the right to perform the work, but I haven't secured any rights to do something like copy the script. As I watch, surely I will remember some parts of the play. Have I "copied" the work within the meaning of IP law? I think we can reject this idea on a fundamental human freedom argument alone. Even if the neurons in my brain contain a copy of a work I don't have the rights to, it doesn't matter. There is a human/machine difference because, below a certain threshold of machine capabilities, we probably believe humans have these kinds of rights while machines don't. If we get to a place where we begin to think machines do have such rights, then the argument does work (perhaps with some added non-discrimination-against-AIs idea to answer my #1).
At the same time, though, I don't think I personally feel a strong obligation not to use AI art, simply because I don't feel a strong obligation to respect IP rights in general. On a policy level I think they have to exist, but let's say I'm listening to a cover of a song and I find out that the cover artist doesn't actually have the appropriate rights secured. I'm not gonna be broken up about it.
A different consideration, though, is what a movement that wants to potentially be part of a coalition with people who are more concerned about AI art should do. A tough question, in my view.