Fully on-board with #3. Change your attitude about what constitutes “a decent life”, such that pretty much all existence is positive-value, and moments of joy are worth much more than weeks of depression cost.
#1 and #2 are less obvious. One of the reasons it’s called the singularity is that EVERYTHING becomes hard to predict. A lot of people are assuming that the concepts of ownership and financial capital remain consistent enough that investments made now retain their power after the big changes. I think they’re mostly wrong: if the economy shifts such that human intellectual labor becomes valueless (note: I’m skeptical this will happen in the next few generations; the style of work will change, but humans will still provide the judgement and values behind intellectual work, even if a lot of the knowledge and “work” is AI), then the economy in general changes so much that no paper ownership matters. Money is meaningless when it’s not circulating in a human-centric economy.
I largely agree with you that “EVERYTHING becomes hard to predict”; that is partly what I meant to allude to with the introductory caveat in my comment. I imagine un-graspably transformative superintelligence well within our lifetime, and cannot give much more advice for that scenario, yet I still keep a non-zero probability on the world & socio-economic structures remaining, for whichever reasons, more recognizable; in that case, #1 and #2 still seem like reasonably natural defaults. But yes, they may apply only within a reasonably narrow band of imaginable AI-transformed futures.