I mean, is the implication that this would instead be good if phenomenological consciousness did come with intelligence?
This was just an arbitrary example to demonstrate the more general idea that it’s possible we could make the wrong assumption about what makes humans valuable. Even if we discover that consciousness comes with intelligence, maybe there’s something else entirely that we’re missing which is necessary for a being to be morally valuable.
I don’t want “humanism” to be taken too strictly, but I honestly think that anything worth passing the torch to wouldn’t require us to pass any torch at all and could just coexist with us…
I agree with this sentiment! Even though I’m open to the possibility of non-humans populating the universe instead of humans, I think it’s a better strategy, for reasons of both practicality and moral uncertainty, to make the transition peacefully and voluntarily.
maybe there’s something else entirely that we’re missing which is necessary for a being to be morally valuable.
You’re talking about this as if it were a matter of science and discovery. I’m not a moral realist, so to me that doesn’t compute. We don’t discover what constitutes moral worth; we decide it. The only discovery involved here may be self-discovery. We could have moral instincts and then introspect to figure out more precisely what they map to. But deciding to follow our moral instincts at all is as arbitrary a call as any other.
I’m open to the possibility of non-humans populating the universe instead of humans
As I said, the only situation in which this would be true for me is, IMO, if either humans voluntarily stop having children (e.g. they see the artificial beings as having happier lives and would thus rather raise one of them than an organic child), or conditions become so harsh that it’s impossible for organic beings to keep existing and artificial ones are the only hope (e.g. Earth is about to be wiped out by the expanding Sun, and we don’t have enough energy to send off a working colony ship with a self-sustaining population, but we CAN send small, light von Neumann interstellar probes full of the AIs we deeply care about).