I’d favor the way Kurzweil portrays a technological Singularity here, where humans themselves become the Gods.
The problem with having a pantheon of Gods… they tend to bicker. With metaphorical lightning bolts. ;)
I don’t think that outcome would be incompatible with an FAI (which may be necessary to do the research to get you your godlike powers). Apart from that initial enabling role for the FAI, the new ‘Gods’ could choose by mutual agreement to create some form of power structure that prevented them from messing each other over and burning the cosmic commons in competition.
So I still hope that AI going foom is wrong and that we see a slow development over many centuries instead, without any singularity type event.
You talked about the downside to mere observation. That would be utterly trivial and benign compared to the effects of Malthusian competition. Humans are not in a stable equilibrium now. We rely on intuitions formed in a different time and under different circumstances to keep us from rapidly rushing toward a miserable equilibrium of subsistence living.
The longer we go before putting a check on the evolutionary pressure toward maximum resource acquisition, the more we will lose of what we value as ‘human’. Yes, everything we value except existence itself, even consciousness in the form that we experience it.
I don’t think I emphasised this enough. Unless the ultimate cooperation problem is solved, we will devolve into something less human than Clippy. Clippy at least has a goal that he seeks to maximise and that motivates his quest for power. Competition would weed out even that much personality.