I stopped arguing politics when I noticed that my beliefs and values had, at some point in the decade of arguing politics, converged.
I returned briefly once more because one of the forum members asked me to intervene in some ugliness that was taking place—one of the people who was nominally on my side had turned rabid in my absence—but otherwise haven’t returned to that pastime.
Since then, my political beliefs have shifted (in some ways, I’m more extreme, in others, more moderate) absent the continual pressure of refinement by argument, but they’ve diverged from my values again.
I’m not certain whether that makes them more or less correct, however. The winnowing away of unnecessary elements may simply have revealed the values underlying the beliefs, rather than the constant defense of those beliefs having driven my values into alignment with them.
Having beliefs and values converge to the truth is the desired outcome. The trick is knowing if the convergence is to the truth, or just the shortest line projection between the two.
Whether in science or law, truth-producing activities tend to be adversarial. Done honestly and with commitment, against capable adversaries, that’s a pretty good system. If you care enough to spend the effort, and have capable and similarly committed adversaries available, I think it’s a much better recipe for arriving at the truth than stewing in the juices of your own beliefs and those of your tribe.
Having your beliefs converge on the truth is the desired outcome.
Values don’t have a truthiness property. If your beliefs and your values converge, something else is going on.
Since you were talking about values in a political context, I assumed they were political values, which usually presuppose some facts. Policies are rarely preferred entirely deontologically.
Do you have specific examples of the values you were referring to?
Policies aren’t values. Values are those things which cause an individual to choose which policies to support.
That’s not helping me at all.
I know the fact value distinction. I’m asking for specific examples so I can understand how you personally apply that.
In order to help you, I have to know what you need help doing.
You suggested that political values (which I’m re-interpreting as either “value” or “policy preference”) presuppose facts. If that’s what you think is the case, our definitions of “value” must diverge, so I assume you’re referring to policy preferences instead.
I’m not sure what you’re asking for examples of, but here are some of my values:
Honesty.
Correctness.
Efficiency.
Here are some of my policy preferences:
Free speech (including lies or simple wrongness).
Bodily autonomy (including abortion, drug usage, and sexuality).
Market autonomy (that is, what is commonly referred to as capitalism).
The axiom underlying my policy preferences is autonomy and self-responsibility. My central personal value is integrity.