Yes, if the parents will always be there to take care of you.
We can wirehead children now.
We want them to be more than that.
The only reason we want that is that civilization would collapse without anyone to bear it. If FAI bears it, there is no pressure on anyone.
What does it mean for FAI to bear civilization? It can give us bridges, but if I’m going to spend time with you, you’d better be socialized. A life of obedient catgirls would harm your ability to deal with real humans (or posthumans).
And ignoring that, I don’t think that we want to be more than we are just in order to get stuff done.
Both of these are things we do to achieve complex values. Some of the things we want are things which can’t be handed to us, and some of those are things which we can’t achieve if everything which can be handed to us is handed to us.
The companions FAI creates for you don’t have to be obedient, nor do they have to be catgirls. Instead, they can be companions that far exceed the value you can get from socializing with fellow humans or posthumans.
Once there is FAI, the best companion for anyone is FAI.
The only reason you want “complex values” is because your environment has inculcated in you that you want them. The reason your environment has inculcated this in you is because such inculcation is necessary in order to have people who will uphold civilization. Once there is FAI, such inculcation is no longer necessary, and is in fact counter-productive.
How rude can I be to my FAI companion before it starts crying in the corner? How rude will I become if it doesn’t? Why didn’t it just build the bridge the first time I asked? Then I wouldn’t have to yell. Does she mind that I call her ‘it’?
Proper companions don’t always give you what you want.
Also, even though FAI could create perfectly balanced agents, and even if creating said agents wasn’t in itself morally reprehensible, I think there is value in interacting with other ‘real’ humans.
Edit: OK, this is a big deal:
The fact that a value I have is something evolution gave me is not a reason to abandon that value. Pleasure is also something I want because evolution made me want it.
Right now, I want those complex values, and I’m not going to press a button to self-modify to stop wanting them.
I don’t see why creating perfectly balanced agents would be morally reprehensible, nor why, given such agents, there would be value in interacting with other humans, who would necessarily be less suited to each other’s progress than the agents would be.
It may well be considered morally reprehensible to communicate with other humans, because it may undermine and slow down the personal development that each human would otherwise benefit from in the company of custom-tailored companions, designed perfectly for one’s individual progress.
It may well be morally better for the FAI to make you think that you’re communicating with a ‘real’ human, when in fact you are communicating with an agent specifically designed to provide you with that learning experience.
If these agents are people in a morally significant way, then their needs must be taken into account. FAI can’t just create slave beings. It’s very difficult for me at this point to say whether it’s possible for the FAI to create a being that perfectly meets some human needs, and in turn has all its own needs met just as perfectly. Every new person it creates just adds more complexity to the moral balance. It might be doable, but it might not, and it’s a lot more work-thought-energy to do it that way.
If they are not people, if they are some kind of puppet zombie robot, then we will have billions of humans falling in love with puppet zombie robots. Because that is their only option. And having puppet zombie robot children. Maybe that’s what FAI will conclude is best, but I doubt it.
I actually think that all our current ways of thinking, feeling and going about life would be as antiquated, post-FAI, as a horse buggy on an interstate highway. Once an AI can reforge us into more exalted creatures than we currently are, I’m not sure why anyone would want to continue living (falling in love? having children?) the old-fashioned way. It would be as antiquated as the lifestyle of the Amish.
Some people want to be Amish. It seems like your statement could just as well be “I’m not sure why anyone would want to be Amish” and I’m not sure that communicates anything useful.
On the one hand, as long as there are sufficient resources for some people to engage in Amish-like living while not depriving everyone else, that could be okay.
On the other hand, if the AI determines that a different way of being is much preferable to insistence on human traditions, then it has its infinite intelligence at its disposal to convince people to go along for the ride.
If the AI is barred both from modifying people and from using its intelligence to convince them, then still, at some point, resources become scarce, and for the benefit of everyone the resource consumption of the refuseniks has to be optimized. I can envision a (to them) seamless transition where they continue living an Amish-like lifestyle in a simulation.
What would we want to be exalted for? So we can more completely appreciate our boredom?
It doesn’t make sense to me that we’d get some arbitrary jump in mindpower and then start an optimized advancement. (We might get some immediate patches, but there will be reasons for them.) Why not pump us all the way to multi-galaxy-brains? Then the growth issues are moot.
Either way, if we’re abandoning our complex evolved values, then we don’t need to be very complex beings at all. If we don’t, then I don’t expect that even our posthuman values will be satisfied by puppet zombie companions.
Is there some reason to believe our current degree of complexity is optimal?
Why would we want to be reforged as something that suffers boredom, when we can be reforged as something that never experiences a negative feeling at all? Or experiences them just for variety, if that is what one would prefer?
If complexity is such a plus, then why stop at what we are now? Why not make ourselves more complex? Right now we chase after air, water, food, shelter, love, social status, why not make things more fun by making us all desire paperclips, too? That would be more complex. Everything we already do now, but now with paperclips! Sounds fun? :)
Possibly relevant: I already desire paperclips.
I don’t, at all. Also you’re conflating our complexity with the complexity of our values.
I think that our growth will best start from a point relatively close to where we are now in terms of intelligence. We should grow into Jupiter brains, but that should be by learning.
I’m not clear on what it is you want to be reforged as, or why. By what measure is post-FAI Dennis better than now-Dennis? By what measure is it still ‘Dennis’, and why were those features retained?
The complexity of human value is not good because it is complex. Rather, these are the things we value; there happen to be a lot of them, and they are complexly interrelated. Chopping away at huge chunks of them and focusing on pleasure is probably a bad thing, which we would not want.
It may be the case that the FAI will extrapolate much more complex values, or much simpler values, but our current values must be the starting point and our current values are complex.
This is an extreme statement about everyone’s preference, not even your own preference or your own belief about your own preference. One shouldn’t jump that far.