I misread a small bit, but I still stand by my answer. It is, however, still unclear to me whether you value truth or not. You mention moral frameworks and opinions, but you also sound like you want to get rid of biases? I think these conflict.
I guess I should give examples to show how I think:
Suppose that climate change is real, but that the proposed causes and solutions are wrong. Or that for some problem X, people call for solution Y, but you expect that Y will actually only make X worse (or be a pretend-solution which gives people a false sense of security, and which is only adopted because it signals virtue).
Suppose that X is slightly bad, but not really worth bothering about; however, team A thinks that X is terrible and team B thinks that X is the best thing ever.
Suppose that something is entirely up to definition, such that truth doesn’t matter (for instance, whether X is a mental illness or not). Also, suppose that whatever definition you choose will be perceived as either hatred or support.
I don’t think it’s good to get any opinions from the general population. If actually intelligent people are discussing an issue, they will likely have more nuanced takes than both the general population and the media.
Let’s say that personality trait X is positively correlated with both intelligence and sexual deviancy. One side argues that this makes people with the trait good; another side argues that this makes them bad. Not only is this subjective, people would be confusing the utilitarian “good/bad” with the moral “good/bad” (easy example: breaking a leg is bad, but having a broken leg does not make you a bad person).
I think being rational/unbiased results in breaking away from society’s opinions about almost everything. I also think that being biased is to be human. The least biased thing of all is reality itself, and a lot of people seem really keen on fixing/correcting reality to be more moral. In my worldview, a lot of things stop making sense, so I don’t bother with them, and I wonder why other people are so bothered by so many things.
I might be unable to respond for a little while myself; sorry about that.
To clarify the question: I’m taking as an axiom that there are just two options, A or B. So this isn’t some big issue like “are the Democrats right on climate change” but something specific like “is the climate changing?”. Then, conditional on that being yes, “is a significant portion of this change human-induced?” would be a separate A vs. B. Then, conditional on that being yes, we could get to “is proposed solution X going to do more harm than good according to my moral framework?”.
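To make the chain explicit (rough notation on my part, nothing formal): if C1 = “the climate is changing”, C2 = “a significant portion of the change is human-induced”, and C3 = “proposed solution X does more harm than good under my framework”, then each question is only asked conditional on the answers before it:

```latex
P(C_1 \wedge C_2 \wedge C_3) \;=\; P(C_1)\, P(C_2 \mid C_1)\, P(C_3 \mid C_1 \wedge C_2)
```

So each A-vs-B step can be judged on its own before moving on to the next.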
About valuing truth or not: the idea is that one of those 5 options would hold for me if I were able to process all available “truth”, but my goal is to approximate the correct option without spending that much time. Does that clarify things?
For your bullet points:
Addressed above.
This is what I mean by “proportional to its importance”, but an additional degree of uncertainty would be “how important is it?”.
This would not fit the axioms posed, but it’s good that you brought it up. Arguing over definitions and semantics with edge cases is something I’m only interested in when the question is whether a certain law applies or not.
I overused the word “bias” in different settings. What I meant by “my intellectual sources could be biased” is that they could be biased in the same direction by hidden variables that I would want to control for. For instance, my personal intellectual influences in a lot of areas are disproportionately Jewish, and most pro-Palestine people I know are not people I look up to.
I hoped to get around this issue by fixing a moral framework. Do you think that doesn’t suffice?
If you don’t have time to reply for an arbitrarily long stretch, I’d still be happy to continue the conversation (even if that’s months from now).
While you could format questions in such a way that you can divide them into A and B in a sensible manner, my usual reaction to thought experiments which seem to make naive assumptions about reality is that the topic isn’t understood very deeply. The problem with looking at the surface (and this is mainly why average people don’t hold valuable opinions) is that people conclude that solar panels, wind turbines and electric cars are 100% “green”, without taking into account the production and recycling of these things. Many people think that charging stations for electric cars are green, but they don’t see the coal power plant which supplies power to the charging station. In other words, “does solution X actually work?” is never asked.

Society often acts like me when I’m being neurotic. When I say “I will clean my house next week”, I allow my house to stay messy while also helping myself forget the matter for now. But this is exactly like saying “we plan to be carbon neutral by 2040” and then doing nothing for another five years.
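To put a rough number on the charging-station point (ballpark public figures, not a proper lifecycle analysis, so treat this as a sketch):

```python
# Back-of-the-envelope: an EV charged from a coal plant vs. a petrol car.
# All figures are rough ballpark values, not measurements.

EV_KWH_PER_KM = 0.18          # typical EV consumption per km
COAL_KG_CO2_PER_KWH = 1.0     # rough emissions of coal-fired electricity
PETROL_KG_CO2_PER_KM = 0.17   # rough tailpipe emissions of a petrol car

ev_on_coal = EV_KWH_PER_KM * COAL_KG_CO2_PER_KWH
print(f"EV on a coal grid: {ev_on_coal:.2f} kg CO2/km")       # ~0.18
print(f"Petrol car:        {PETROL_KG_CO2_PER_KM:.2f} kg CO2/km")
# On a coal-heavy grid the two are roughly on par, which is the point:
# the charger is only as green as whatever feeds it.
```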
And yes, that does clarify things!
Valid, but knowing what’s important might require understanding the problem in the first place. A lot of people want you to think that the thing they’re yelling about is really important.
Then the axioms do not account for a lot of controversial subjects. I think the abortion debate also depends on definitions: “at how many weeks can the child be said to be alive?” “When is it your own body, and when is it another body living inside you?”
I’m afraid it doesn’t. I believe that morality has little to no correlation with intelligence, and that truth has little to do with morality. I’d go as far as to say that morality is one of the biases that people have, but you could call these “values” instead of biases.
To actually answer your question, I think understanding human nature and the people you’re speaking to is helpful, along with the advantages of pushing certain beliefs and the advantages of holding certain beliefs.
If somebody grew up with really strict parents, they might value freedom, whereas somebody who lacked guidance might recognize the danger of doing whatever one feels like doing. And whether somebody leans left or right economically seems influenced by their own perceived ability to support themselves. One’s level of pity for others seems to be influenced by one’s own confidence, since there’s a tendency to project one’s own level of perceived fragility.
If you could measure a group’s biases perfectly, then you could subtract them from the position the group holds. If there are strong reasons to lean towards X, but X is only winning by a little bit, then X might not be true.

You can often also use reason to find inconsistencies; I’d go as far as saying that inconsistencies are obvious everywhere unless you unconsciously try to avoid seeing them. Discrimination based on inherent traits is supposedly wrong, yet it’s socially acceptable to make fun of stupid people, ugly people, short people and weirdos? The real rule is obviously closer to something like “discrimination is only acceptable towards those who are either perceived to be strong enough to handle it, or those who are deemed to be immoral in general”. If you think about it enough, you will likely find that most things people say are lies. There are also some who have settled on “it’s all social status games and signaling”, which is probably just another way of looking at the same thing. Speaking of thinking: if you start to deconstruct morality and question it, you might put yourself out of sync with other people permanently, so you’ve been warned.
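A toy sketch of what I mean by subtracting the bias (invented numbers and a made-up function; actually measuring the bias term is the hard part):

```python
# Toy sketch: "subtracting" a group's measured bias from its stated support.
# All numbers are invented; getting the bias term this cleanly is the hard part.

def debiased_support(observed: float, predicted_by_bias: float) -> float:
    """Residual support for X after removing what bias alone would predict.

    observed: fraction of the group endorsing X (0..1).
    predicted_by_bias: fraction we'd expect to endorse X purely from the
        group's known leanings (upbringing, incentives, tribe), absent
        any evidence either way.
    Positive residual: the evidence pushes toward X beyond the bias.
    Negative residual: X is "winning" only because the deck was stacked.
    """
    return observed - predicted_by_bias

# X wins 58% of the group, but their leanings alone would predict 70%:
print(f"{debiased_support(0.58, 0.70):+.2f}")  # -0.12 -> lean away from X
```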
But the best advice I can give is likely just to read the 10 or so strongest arguments you can find on both sides of the issue and then judge for yourself. If you can’t trust your own judgement, then you likely also can’t trust your own judgement about whom you can trust to judge for you. And if you can judge this comment of mine, then you can likely judge people’s takes on things in general; if you can’t judge this comment of mine, then you won’t be able to judge the advice you get about judging advice, and you’re stuck in a sort of loop.
I’m sometimes busy for a day or two, but I don’t think my replies will be delayed longer than that.