What would we want to be exalted for? So we can more completely appreciate our boredom?
It doesn’t make sense to me that we’d get some arbitrary jump in mindpower and then start an optimized advancement. (We might get some immediate patches, but there will be reasons for them.) Why not pump us all the way to multi-galaxy-brains? Then the growth issues are moot.
Either way, if we’re abandoning our complex evolved values, then we don’t need to be very complex beings at all. If we aren’t abandoning them, then I don’t expect that even our posthuman values will be satisfied by puppet zombie companions.
Is there some reason to believe our current degree of complexity is optimal?
Why would we want to be reforged as something that suffers boredom, when we can be reforged as something that never experiences a negative feeling at all? Or experiences them just for variety, if that is what one would prefer?
If complexity is such a plus, then why stop at what we are now? Why not make ourselves more complex? Right now we chase after air, water, food, shelter, love, and social status; why not make things more fun by making us all desire paperclips, too? That would be more complex. Everything we already do now, but now with paperclips! Sounds fun? :)
Is there some reason to believe our current degree of complexity is optimal?
I don’t, at all. Also, you’re conflating our complexity with the complexity of our values.
I think that our growth will best start from a point relatively close to where we are now in terms of intelligence. We should grow into Jupiter brains, but that should happen by learning.
I’m not clear on what it is you want to be reforged as, or why. By what measure is post-FAI-Dennis better than now-Dennis? By what measure is it still ‘Dennis’, and why were those features retained?
The complexity of human value is not good because it is complex. Rather, these are the things we value; there happen to be a lot of them, and they are complexly interrelated. Chopping away huge chunks of them and focusing on pleasure is probably a bad thing, which we would not want.
It may be the case that the FAI will extrapolate much more complex values, or much simpler values, but our current values must be the starting point, and our current values are complex.
Possibly relevant: I already desire paperclips.