I actually think that all our current ways of thinking, feeling, and going about life would be as antiquated, post-FAI, as a horse buggy on an interstate highway. Once an AI can reforge us into more exalted creatures than we currently are, I’m not sure why anyone would want to continue living (falling in love? having children?) the old-fashioned way. It would be as antiquated as the lifestyle of the Amish.
Some people want to be Amish. It seems like your statement could just as well be “I’m not sure why anyone would want to be Amish” and I’m not sure that communicates anything useful.
On the one hand, as long as there are sufficient resources for some people to engage in Amish-like living while not depriving everyone else, that could be okay.
On the other hand, if the AI determines that a different way of being is much preferable to insistence on human traditions, then it has its infinite intelligence at its disposal to convince people to go along for the ride.
If the AI is barred both from modifying people and from using its intelligence to convince them, then still, at some point, resources become scarce, and for the benefit of everyone, the resource consumption of the refuseniks has to be optimized. I can envision a (to them) seamless transition where they continue living an Amish-like lifestyle in a simulation.
What would we want to be exalted for? So we can more completely appreciate our boredom?
It doesn’t make sense to me that we’d get some arbitrary jump in mindpower and then start an optimized advancement. (We might get some immediate patches, but there will be reasons for them.) Why not pump us all the way up to multi-galaxy-brains? Then the growth issues are moot.
Either way, if we’re abandoning our complex evolved values, then we don’t need to be very complex beings at all. If we aren’t, then I don’t expect that even our posthuman values would be satisfied by puppet zombie companions.
Is there some reason to believe our current degree of complexity is optimal?
Why would we want to be reforged as something that suffers boredom, when we can be reforged as something that never experiences a negative feeling at all? Or experiences them just for variety, if that is what one would prefer?
If complexity is such a plus, then why stop at what we are now? Why not make ourselves more complex? Right now we chase after air, water, food, shelter, love, and social status; why not make things more fun by making us all desire paperclips, too? That would be more complex. Everything we already do now, but now with paperclips! Sounds fun? :)
Possibly relevant: I already desire paperclips.
Is there some reason to believe our current degree of complexity is optimal?
I don’t, at all. Also, you’re conflating our complexity with the complexity of our values.
I think that our growth will best start from a point relatively close to where we are now in terms of intelligence. We should grow into Jupiter brains, but that should happen by learning.
I’m not clear on what it is you want to be reforged as, or why. By what measure is post-FAI-Dennis better than now-Dennis? By what measure is it still ‘Dennis’, and why were those features retained?
The complexity of human value is not good because it is complex. Rather, these are the things we value; there happen to be a lot of them, and they are complexly interrelated. Chopping away at huge chunks of them and focusing on pleasure is probably a bad thing, which we would not want.
It may be the case that the FAI will extrapolate much more complex values, or much simpler values, but our current values must be the starting point, and our current values are complex.