The examples of the different things people care about that immediately jump to mind (what Tim said) are not representative of their real values.
How do you know what their real values are? Even after everyone’s professed values get destroyed by the truth, it’s not at all clear to me that we end up in roughly the same place. Intellectuals like you or me might aspire to grow up to be a superintelligence, while others seem to care more about pleasure. By what standard are we right and they wrong?

Configuration space is vast: however much humans might agree with each other on questions of value compared to an arbitrary mind (clustered as we are into a tiny dot in the space of all possible minds), we still disagree widely on all sorts of narrower questions (zoom in on the tiny dot and it becomes a vast globe, throughout which we are widely dispersed). And this applies at multiple scales: I might agree with you or Eliezer far more than I would with an arbitrary human (clustered as we are into a tiny dot in the space of human beliefs and values), but ask a still narrower question and you’ll see disagreement again. I just don’t see how the granting of veridical knowledge is going to reduce all this disagreement to triviality.

Some might argue that while we can want all sorts of different things for ourselves, we might be able to agree on some meta-level principles about what we want to do together: for example, we could agree to have a diverse society. But this doesn’t seem likely to me either; that kind of distinction between object-level wants and meta-level principles doesn’t seem to be built into human values. What could possibly force that kind of convergence?
Even after everyone’s professed values get destroyed by the truth, it’s not at all clear to me that we end up in roughly the same place. Intellectuals like you or me might aspire to grow up to be a superintelligence, while others seem to care more about pleasure.
Your conclusion may be right, but the HedWeb isn’t strong evidence: as far as I recall, David Pearce holds a philosophically flawed belief called “psychological hedonism”, which says that pleasure and pain are all that humans are motivated by, and that therefore nothing else matters, or some such. So I would say that his moral system has not yet had to withstand a razing attempt from all the truth hordes that are out there roaming the Steppes of Fact.
Okay, I’m writing this one down.