I like the angle you’ve explored. Humans are allowed to care about humans — and propagate that caring beyond its most direct implications. We’re allowed to care not only about humans’ survival, but also about human art and human communication and so on.
But I think another angle is also relevant: there are just cooperative and non-cooperative ways to create art (or any other output). If AI creates art in non-cooperative ways, it doesn’t matter how the algorithm works or if it’s sentient or not.
It’s a fair angle in principle. If, for example, two artists agreed to create N works and train an AI on the whole set to produce “hybrid” art mixing their styles, that would be entirely legitimate algorithmic art, and I doubt anyone would take issue with it. The problem right now is precisely that N needs to be inordinately large. A model that could create art via few-shot learning would make the copyright questions much easier to solve. What puts AI and artists on an inevitable collision course is that, in practice, the only realistic approach today requires millions of dollars in compute and a tagged training set far larger than public domain material alone can provide.
Maybe I’ve misunderstood your reply, but my point was that even humans can hypothetically produce art in non-cooperative, disruptive ways without breaking any existing laws.
Imagine a silly hypothetical: one of the best human artists gets a time machine and starts offering their art for free. That artist functions like an image generator. Is such an artist doing something morally questionable? I would say yes.
If they significantly undercut the competition by using some trick, I would agree they are, though it’s mostly a grey area (what if, instead of a time machine, they just have inherited money that lets them work without worrying about making a living? Can’t people release their work for free?).
I think we can simply judge by the consequences (where “consequences” need not mean a utilitarian calculus). If some way of “injecting” art into culture is too disruptive, we can decide not to allow it. It doesn’t matter who makes the injection or how.