Unlike you, I am also unconvinced it would cease to value unaugmented humans, or infants. Similarly, I am unconvinced that it would continue to value its own existence, or, well, anything at all.
Even if you don’t buy my arguments, given the nearly infinite search space of things that it could end up valuing, what would its probability of valuing any one specific thing like “unaugmented humans” end up being?
But I don’t think we can actually sustain this discussion with an answer to that question at any level more detailed than a handwavy notion of “vastly augmented” and analogies to insects and protozoa, so I’m content to posit either that it does, or that it doesn’t, whichever suits you.
Fair enough, though we could probably obtain some clues by surveying the incredibly smart—though merely human—geniuses that do exist in our current world, and extrapolating from there.
> My own intuition, FWIW, is that some such minds will remember their true origins...
It depends on what you mean by “remember”, I suppose. Technically, it is reasonably likely that such minds would be able to access at least some of their previously accumulated experiences in some form (they could read the blog posts of their past selves, if push comes to shove), but it’s unclear what value they would put on such data, if any.
You keep talking like this, as though these kinds of value judgments were objective, or at least reliably intersubjective. It’s not at all clear to me why.
Maybe it’s just me, but I don’t think that my own, personal memories of my own, personal infancy would differ greatly from anyone else’s—though, not being a biologist, I could be wrong about that. I’m sure that some infants experienced environments with different levels of illumination and temperature; some experienced different levels of hunger or tactile stimuli; and so on. However, the amount of information that an infant can receive and process is small enough that the sum total of his experiences would be far from unique. Once you’ve seen one poorly resolved bright blob, you’ve seen them all.
By analogy, I ate a banana for breakfast yesterday, but I don’t feel anything special about it. It was a regular banana from the store; once you’ve seen one, you’ve seen them all, plus or minus some minor, easily comprehensible details like degree of ripeness (though, of course, I might think differently if I were a botanist).
IMO an augmented mind might well think the same way about ordinary humans. Once you’ve seen one human, you’ve seen them all, plus or minus some minor details...
> what would its probability of valuing any one specific thing like “unaugmented humans” end up being?
Vanishingly small, obviously, if we posit that its pre-existing value system is effectively uncorrelated with its post-augment value system, which it might well be. Hence my earlier claim that I am unconvinced that a “sufficiently augmented” human would continue to value unaugmented humans. (You seem to expect me to disagree with this, which puzzles me greatly, since I just said the same thing myself; I suspect we’re simply not understanding one another.)
> we could probably obtain some clues by surveying the incredibly smart—though merely human—geniuses that do exist in our current world, and extrapolating from there.
Sure, we could do that, which would give us an implicit notion of “vastly augmented intelligence” as something like naturally occurring geniuses (except on a much larger scale). I don’t think that’s terribly likely, but as I say, I’m happy to posit it for discussion if you like.
> it’s unclear what value they would put on such data, if any. [...] I don’t think that my own, personal memories of my own, personal infancy would differ greatly from anyone else’s [...] IMO an augmented mind might well think the same way about ordinary humans. Once you’ve seen one human, you’ve seen them all, plus or minus some minor details...
I agree that it’s unclear.
To say that more precisely: an augmented mind would likely not value its own memories over some roughly identical other memories, or any particular ordinary human, any more than an adult human values his own childhood blanket over an identical blanket, or values one particular and easily replaceable goldfish.
The thing is, some adult humans do value their childhood blankets, or one particular goldfish.
> You seem to expect me to disagree with this, which puzzles me greatly, since I just said the same thing myself; I suspect we’re simply not understanding one another.
That’s correct; for some reason, I was thinking that you believed that a human’s preference for the well-being of his (formerly) fellow humans is likely to persist after augmentation. Thus, I did misunderstand your position; my apologies.
> The thing is, some adult humans do value their childhood blankets, or one particular goldfish.
I think that childhood blankets and goldfish are different from an infant’s memories, but perhaps this is a topic for another time...
> The thing is, some adult humans do value their childhood blankets, or one particular goldfish.
And others don’t.
> I think that childhood blankets and goldfish are different from an infant’s memories, but perhaps this is a topic for another time...
I’m not quite sure what other time you have in mind, but I’m happy to drop the subject. If you want to pick it up some other time, feel free.