Humans exist in a permanent “biased state”. The unbiased state is the province of Mr. Spock and Mr. Data, Vulcans and androids.
I think that rationality does not get rid of biases, but rather allows you to recognize them and compensate for them. Just like with e.g. fear—you rarely lose a particular fear altogether, you just learn to control and manage it.
You seem to mean that biases are the brain’s way of perceiving the world so that it focuses on the ‘important’ parts. Besides terminal goals, which just evaluate the perception with respect to utility, this acts as a filter but thereby also implies goals (namely, reducing the importance of the filtered-out parts).
Yes, but note that a lot of biases are universal to all humans. This means they are biological (as opposed to cultural) in nature. And this implies that the goals they developed to further are biological in nature as well. Which means that you are stuck with these goals whether your conscious mind likes it or not.
Yes. That’s what I meant when I said: “You wouldn’t want to rationalize against your emotions. That will not work.”
If your conscious mind has goals incompatible with the effects of bioneuropsychological processes, then frustration seems like the least of the consequences.
I still don’t know about that. A collection of such “incompatible goals” has been described as civilization :-)
For example, things like “kill or drive away those-not-like-us” look like biologically hardwired goals to me. Having a conscious mind have its own goals incompatible with that one is probably a good thing.
Sure, we have to deal with some of these inconsistencies. And for some of us this is a continuous source of frustration. But we do not have to add more of these than absolutely necessary, do we?