I notice-1 that this carries an implicit claim that claims about reality, rather than about one’s own feelings and experiences, are not valid; that one should say “I observe that I have a belief that the sky is blue” rather than “the sky is blue.” I don’t think Val actually thinks this, but it’s a super scary thing, both because its implications are awful and because a lot of people (not Val!) seem to actually believe this or argue for it.
Thus, I have a very hostile emotional reaction to responding to “X is bad” with “I think that what’s going on is that Y is going on inside your brain, making you have the emotional reaction that X is bad; can you say more about this, but only talk in this fashion?”, especially when it’s directed at someone who is explicitly rejecting this frame, and is in fact doing so in this conversation in order to argue against it.
What? Look, if you and I are having a conversation about animals and I bring up bats and you go “bats are evil and anyone who approves of them is evil” (which is an actual word PDV actually used in this conversation), I think it’s a reasonable response for me to go “uh, bro, are you okay? It sounds like you’ve got a thing about bats.” We don’t have to go into it if you don’t want to, but refusing to acknowledge the thing that just happened seems weird to me, even if I only care about epistemics: if I’m right, then everything you say about bats needs to be filtered to take into account that, I dunno, bats killed your family or whatever. (And this consideration is orthogonal to respecting your boundaries around bats, whatever they are.)
(Also, this probably goes without saying, but just in case: I don’t think Val is making anything like this claim, and I think “but only talk in this fashion” is a strawman. I do still think there was something not ideal about Val using an NVC-ish frame here, but I’m also sympathetic to his defense.)
Look, if you and I are having a conversation about animals and I bring up bats and you go “bats are evil and anyone who approves of them is evil” (which is an actual word PDV actually used in this conversation), I think it’s a reasonable response for me to go “uh, bro, are you okay? It sounds like you’ve got a thing about bats,”
I strenuously disagree. This would be an extremely annoying sort of response, and I would think less of anyone who responded like this.
People can have strong opinions without those opinions coming from, like, emotional trauma or whatever. Insinuating some irrational, emotional motivation for a belief, in lieu of discussing the belief itself or asking how someone came to have it, etc., is simply rude.
(It’s different if you explicitly say “you’re wrong, and also, you only believe that because of [insert bad reason here]”. But that’s not what you’re doing, in your bat-hypothetical!)
I can’t emphasize enough how important the thing you’re mentioning here is, and I believe it points to the crux of the issue more directly than most other things that have been said so far.
One can weakman postmodernism as making basically the same claim, but this doesn’t change the fact that a lot of people are running an algorithm in their heads with the textual description “there is no outside reality, only things that happen in my mind.” This algorithm seems to produce different behaviors in people than the algorithm “outside reality exists and is important” does. I think the former tends to produce behaviors that are a lot more dangerous than the latter, even though it’s always possible to make philosophical arguments that make one algorithm seem much more likely to be “true” than the other. It’s crucial to realize that not everyone is running the perfectly steelmanned version of such algorithms for updating one’s beliefs based on observations of one’s own belief-updating processes, and such things are very tricky to get right.
Even though it’s valid to make observations of the form “I observe that I am running a process that produces the belief X in me”, it is very risky to create a social norm that treats such statements as superior to statements like “X is true”, because such norms create a tendency to assign less validity to statements like “X is true”. In other words, such a norm can itself become a process that produces the belief “X is not true”, when we don’t necessarily want to move our beliefs about X just because we begin to understand how the processes behind them work. It’s very easy to slide from “X is true” to “I observe I believe X is true” to “I observe there are social and emotional influences on my beliefs” to “there are social and emotional influences on my belief in X” and finally to “X is not true”, and I can’t help but feel a mistake is being made somewhere in that chain.