It seems to me that many political arguments are not about values, e.g. people both for and against nuclear power will typically argue that their way of doing things will be better for the environment. My guess is that nuclear power debates would be more productive if participants did not see things in terms of conflict. Do you disagree?
Elsewhere in this thread you say that politics brings out the worst in you rationality-wise. It doesn’t sound like you agree with my proposed explanation for why this might be happening. I’d be quite curious to hear yours.
It seems to me that many political arguments are not about values
Political conflict comes from conflicts over facts and values. Facts are relatively easy to establish. Values simply conflict.
people both for and against nuclear power will typically argue that their way of doing things will be better for the environment.
Where “better” drops the context of “better according to my values”, so that better to me is not better to you. Better is a value judgment, and our values are not identical.
People are hopeless to talk politics with until they grok this.
The first thing to do in any honest negotiation is to mutually communicate your values.
Elsewhere in this thread you say that politics brings out the worst in you rationality-wise.
People tend to think poorly when something is on the line, in conflict with what others have on the line. But there is a conceptual difficulty prior to that, where they mistake their preferences for facts of the universe, equally applicable to all.
It’s difficult to be rational when you’re in conflict with others about significant values. It’s next to impossible if your fundamental concepts structurally commit you to error about the reality of the conflict.
Political conflict comes from conflicts over facts and values. Facts are relatively easy to establish. Values simply conflict.
Not only that—there are also models (which, for the purposes of this thread, we can define as maps that produce forecasts).
To reuse the example in the grandfather post, Alice and Bob arguing about nuclear power could have exactly the same values and agree about the facts. However, Alice has a model which forecasts that in a hundred years nuclear power leads to radioactive deserts, while Bob has a model which forecasts that in a hundred years nuclear power leads to nothing but some safely hidden-away containers of radioactive waste.
Alice and Bob differ in their expectations of the future—that’s neither facts nor values.
(Yes, I’m familiar with Aumann’s agreement theorem, but it just doesn’t work in reality.)
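The facts/values/models split can be made concrete with a toy sketch. All of the numbers and model assumptions below are hypothetical; the point is only the structure: Alice and Bob consume the same facts and score outcomes with the same utility function (same values), yet reach opposite conclusions because their models forecast different futures.

```python
# Toy illustration (hypothetical numbers): same facts, same values,
# different models -> different conclusions.

facts = {"reactors": 100}  # agreed-upon facts

def alice_model(facts):
    # Alice's model: containment fails over a century
    return {"contaminated_area_km2": facts["reactors"] * 50}

def bob_model(facts):
    # Bob's model: waste stays safely contained
    return {"contaminated_area_km2": 0}

def utility(forecast):
    # Shared values: both dislike contamination equally
    return -forecast["contaminated_area_km2"]

def supports_nuclear(model):
    # Support nuclear power iff the forecast is no worse than the status quo
    return utility(model(facts)) >= 0

print(supports_nuclear(alice_model))  # False
print(supports_nuclear(bob_model))    # True
```

Nothing in the disagreement lives in `facts` or `utility`; it lives entirely in the model functions, which is the third axis the comment above is pointing at.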
where they mistake their preferences for facts of the universe
Yes, I agree it’s really hard to talk to people who don’t realize this.
Tell that to a scientist (one who establishes facts as a profession).
Where “better” drops the context of “better according to my values”, so that better to me is not better to you. Better is a value judgment, and our values are not identical.
I disagree that this is the case for folks who argue about nuclear power.
You’ve become more accurate in your assessment of the situation.
That’s not my ideal.
Yeah, it’s probably worthwhile to separate out models and their predictions from facts.