I am not sure how obvious the point is that there are multiple possible futures. Most likely, the AI would not be able to model all of them; but without the AI, most of them wouldn’t happen anyway.
It’s like saying “if I don’t roll a die, I lose the chance of rolling a 6”, to which I add “and if you do roll the die, you still have a 5⁄6 probability of not rolling a 6”. Just to make it clear that by avoiding the “spontaneous” future of humankind, we are not avoiding one specific future magically prepared for us by destiny. We are avoiding the whole probability distribution, which contains many possible futures, both nice and ugly.
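To make the die analogy concrete, here is a minimal sketch (the function name and trial count are just illustrative choices of mine, not anything from the original discussion): choosing to roll does not buy you the 6, it buys you the whole distribution of outcomes, most of which are still not a 6.

```python
import random

def chance_of_six(roll_the_die: bool, trials: int = 100_000) -> float:
    """Estimate the probability of getting a 6 under each choice.

    Not rolling means the outcome is fixed and can never be a 6;
    rolling means accepting the whole distribution of outcomes,
    of which roughly 5/6 are still not a 6.
    """
    sixes = 0
    for _ in range(trials):
        outcome = random.randint(1, 6) if roll_the_die else 0  # 0 stands for "no roll"
        if outcome == 6:
            sixes += 1
    return sixes / trials

print(chance_of_six(roll_the_die=False))  # 0.0 -- no chance of a 6 at all
print(chance_of_six(roll_the_die=True))   # ~0.167 -- and ~5/6 of rolls are still not a 6
```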
Just because the AI can only model something imperfectly, it does not mean that without the AI the future would be perfect, or even better on average than with the AI.
‘Unmediated’ may not have been quite the word to convey what I meant.
My impression is that CEV is permanently established very early in the AI’s history, but I believe that what people are and want (including what we would want if we knew more, thought faster, were more the people we wished we were, and had grown up closer together) will change, both because people will be doing self-modification and because they will learn more.