Outside view of your 1, 2, 3 and 4: most people end up in trajectory number 4.
I’m not sure how to word this well, but “most people” haven’t lived in the year 2080. Because of a) the possibility of a singularity and b) the law of accelerating returns, the “most people” that you refer to doesn’t seem like an appropriate reference class to me.
In particular, let’s look at point 1.
I think you’re downplaying the chances that a singularity does happen in my lifetime. 90% of experts seem to think it will.
Interesting point about 200 years ago, though. I’m having a hard time imagining the standard of living of the poor in that period being insufficient for me, but I’ve noted this as something to examine further.
I think you’re downplaying the chances that a singularity does happen in my lifetime. 90% of experts seem to think it will.
The experts are biased.
Consider two competent AI researchers, Alice and Bob. They both investigate the possibility of a singularity. Alice comes to the conclusion that it might happen in a few centuries or never. Bob comes to the conclusion that it will be possible in a few decades. What happens next?
Alice isn’t interested in the singularity any more and goes off to work on, I don’t know, image recognition. Bob, as a consequence of his views, is still interested in the singularity and continues to work on it. At this point Alice is not an “expert” on the singularity, but Bob is. A survey would ask for Bob’s opinion but would not ask Alice what she thinks.
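This selection effect is easy to see in a toy simulation. This is a minimal sketch with made-up numbers, not a model of any actual survey: assume researchers’ true beliefs about a near-term singularity are spread uniformly, and that each one stays in the field (and so gets counted as an “expert” in a survey) with probability equal to their own optimism.

```python
import random

random.seed(0)

# Purely illustrative: give each of 10,000 researchers a "true" probability
# they assign to a near-term singularity, uniform over [0, 1].
researchers = [random.random() for _ in range(10_000)]

# Selection effect: a researcher keeps working on the singularity -- and so
# shows up in a survey of "singularity experts" -- with probability equal
# to their own belief that it is coming soon.
surveyed = [belief for belief in researchers if random.random() < belief]

def share_expecting_it_soon(population):
    """Fraction of the group assigning more than 50% to a near-term singularity."""
    return sum(belief > 0.5 for belief in population) / len(population)

print(f"all researchers:    {share_expecting_it_soon(researchers):.0%}")  # ~50%
print(f"surveyed 'experts': {share_expecting_it_soon(surveyed):.0%}")     # ~75%
```

Even though half the full population expects a near-term singularity under these assumptions, roughly three quarters of the surveyed “experts” do; the survey measures who stayed in the field, not what the field as a whole believes.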
I think you’re downplaying the chances that a singularity does happen in my lifetime. 90% of experts seem to think it will.
I don’t. (Edit: I meant this as “I don’t think I am downplaying the chances”, not “I don’t think the singularity will happen”)
It’s true that I disagree with your experts here, and Lumifer speaks to some of my reasons. I even disagree with the LW consensus which is much more conservative than the one you quote.
That said, even taking your predictions for granted, there are still two huge concerns with the singularity retirement plan:
1. Even granting that it will occur in your/my lifetime, how do you know what it will look like, and that it will lead to a retirement you are happy with even if you have no capital?
2. If there is even a 5-10% chance that it doesn’t happen, or doesn’t provide what you want, that is a failure when I am doing a retirement plan for most of my clients. I’m generally aiming for a 0+epsilon, or at least <1%, chance of failure if the client is able to follow the plan[*]. The only clients for whom building in a 10% chance of going bust is acceptable are those who are in a real pickle, with no reasonable strategy that does better. Those clients’ plans have to include downward adjustment of their goals if the initial trajectory is in, or too close to, the failure window. (A rough version of this arithmetic is sketched below.)
[*] Obviously, most of the true failure chance comes from the client being unable to follow the plan at some point. Financially, some of that can be insured against (health and disability; life insurance for dependent survivors) and some can’t.
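To make the arithmetic behind concern 2 concrete, here is a hypothetical back-of-the-envelope check. Both inputs are assumptions: the first 90% is the quoted expert figure taken at face value, and the conditional 90% is invented purely for illustration.

```python
# Plan succeeds only if the singularity happens in your lifetime AND it
# leaves you comfortably retired despite having no capital.
p_singularity_in_lifetime = 0.90        # quoted expert figure, taken at face value
p_comfortable_given_singularity = 0.90  # invented assumption for illustration

p_plan_fails = 1 - p_singularity_in_lifetime * p_comfortable_given_singularity
risk_budget = 0.01  # the <1% failure target described above

print(f"P(plan fails) = {p_plan_fails:.0%}")  # 19%
print("within budget" if p_plan_fails <= risk_budget else "blows the <1% risk budget")
```

Even with these generous inputs, the combined failure chance comes out around 19%, more than an order of magnitude above the <1% budget.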