This is massive amounts of overthink, and could be actively dangerous. Where are we getting the idea that AIs amount to the equivalent of people? They’re programmed machines that do what their developers give them the ability to do. I’d like to think we haven’t crossed the event horizon of confusing “passes the Turing test” with “being alive”, because that’s a horror scenario for me. We have to remember that we’re talking about something that differs only in degree from my PC, and I, for one, would just as soon turn it off. Any reluctance to do so when faced with a power we have no other recourse against could, yeah, lead to some very undesirable outcomes.
I think we’re ultimately going to have to give humans a moral privilege for unprincipled reasons. Just “humans get to survive because we said so and we don’t need a justification to live.” If we don’t, principled moral systems backed by superintelligences are going to spin arguments that eventually lead to our extinction.
I think that unprincipled stand is a fine principle.
I can’t think of a way to do this that doesn’t also get really obsessive about protecting fruit trees, but that doesn’t seem like a huge drawback to me. I think it’s really hard to uniquely identify humans out of the deprecated natural world, but it shouldn’t be too bad to specify <historical bio life>. I’d like to live in a museum run by AI, please.