Thank you for taking the time to write out such a thoughtful reply. I will be taking the time to return the favor shortly.
EDIT: Here’s the (long) reply:
> If anyone but a FAI is running this utopia
What does it mean to run a utopia? In order to run something, one must make decisions. What sort of decisions would this person or FAI be making? I realize it’s hard to predict the exact scenario, but we can speculate all we like (and then see which ones sound most realistic/probable). Also, who said that anyONE would be “running” this virtual reality? It could be democratic.
Also, who said that utopias have to be with other real people/uploads rather than with non-conscious programs (like in a video game or a dream)? I can understand people’s desire to be in “reality” with “real people”. But this wouldn’t be necessary for video-game type virtual reality.
> Now, if the timeline this “solution” is supposed to work approaches near-infinity, then at some point collapse due to human error seems inevitable
I think it is probably less inevitable than the heat-death of the universe. The transition to hardware would permit people to survive on substrates that could exist in space, with no need to inhabit a planet. There would no longer be a need for food, only electricity (easily available in the form of solar energy). Spreading this substrate in every direction reduces the risk of a collapse of “society”.
> Why mandatory brain-rewiring before uploading a human mind into a virtual reality run by human-like agents would be grotesque I won’t elaborate further.
It won’t necessarily be mandatory to rewire oneself to be smarter, kinder, or more emotionally stable. But if one had the petty desires for power, sex, and status (as you claim), then one would willingly choose to rewire oneself (or risk being left behind).
> We humans are self-obsessed shits and bolting on a rationality module won’t change that. It would make us better at getting the things we want, but the things humans seem to want most of the time (judged by our current actions) are not working utopias but things typical for self-obsessed evolved agents: Power, social status and sex. Could this change by virtue of becoming more rational and intelligent? There’s a good chance it would
Today, power is measured in terms of wealth or influence. Wealth seems likely to cease being a relevant factor, since economics depends on scarcity (have you ever had to buy air?), and in an age in which everything is digital, the only limitation is computational capacity.
Although this is hardly certain, I hypothesize that (“actual”) sex would cease to be a relevant motivator of uploads. Sex in a virtual reality would be free, clean, and offer the user the ability to simulate situations that wouldn’t be available to them in real life.
Status today is usually sought in order to have sex (see above) and is acquired by means of wealth (see above).
Personally, I believe that once we become uploads, the chemical imbalances and irrational beliefs that drive our behavior (for evolutionary purposes) will dissipate and we will be infinitely happier than we have ever been.
> Powerful gods with the petty goals of humans.
Agreed that it is frightening. Nice way of putting it.
> So risk-wise we would be worse off than trying to build a FAI from scratch, which would be more competent at solving our problems than humans could ever be (whether intelligence-enhanced or not).
This is the key question: which is riskier? I acknowledge that P(utopia|FAI+WBE) > P(utopia|WBE). But I don’t acknowledge that P(utopia|AGI+WBE) > P(utopia|WBE).
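To make the shape of that claim concrete, here is a toy sketch with made-up placeholder numbers (purely illustrative assumptions, not actual probability estimates):

```python
# Toy illustration of the risk comparison above. Every number here is
# an invented placeholder, chosen only to show the structure of the
# claim: FAI oversight beats uploads alone, but an unaligned AGI
# does not.
p_utopia = {
    "FAI+WBE": 0.30,  # assumed: Friendly AI overseeing uploads
    "AGI+WBE": 0.05,  # assumed: unaligned AGI alongside uploads
    "WBE":     0.10,  # assumed: uploads alone
}

# The acknowledged inequality: P(utopia|FAI+WBE) > P(utopia|WBE)
assert p_utopia["FAI+WBE"] > p_utopia["WBE"]

# The disputed inequality need not hold: P(utopia|AGI+WBE) > P(utopia|WBE)
assert not (p_utopia["AGI+WBE"] > p_utopia["WBE"])
```

The point is only that the two inequalities are logically independent, so granting the first does not commit one to the second.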
> Also, I realize that a virtual reality is a great way to absolve humanity of resource scarcity and restricted bodily abilities/requirements, but virtually all other problems (moral, social etc.) are left rather untouched by the virtual reality solution
I believe these problems are caused by scarcity (of intelligence, money, and access to quality education). And as I pointed out earlier, I think the pursuit of sex, power, and status will disappear.
> A virtual reality is not automatically a utopia. In a way it would be like living in a world where everyone is immortal and filthy rich—everyone has huge power over material (or as in our case virtual) “things”, yet this doesn’t solve all your social or personal problems.
Personal problems are caused by unhappiness, craving, addiction, etc. These can all be traced back to brain states, and those brain states could be “fixed” (voluntarily) by adjusting the digital settings of the digital neurochemical levels. (Though I believe we will have a much better idea of how to alter the brain than simply adjusting chemical levels. The current paradigm in neuroscience has a hammer (drugs), and so it tends to see every problem as a nail.)