I don’t consider managing people’s emotions to be part of the subject matter of epistemic rationality,
This sounds to me like an extremely large mistake. Emotions sure do seem to be the rate-limiting factor for epistemically productive interaction in a very large fraction of situations, and therefore managing them is a very central issue for human epistemic rationality in practice.
Zach’s post is not vibe-neutral, because nothing is vibe-neutral. There’s a subtextual claim that:
1. When people criticize your arguments, you should take it as a gift.
2. When you criticize other people’s opinions, you should present it as a gift.
3. When “debating,” be chill, as if you are at the grocery store check-out.
I think this is a good strategy, and that (2) actually can succeed at quelling bad emotional reactions. If you present an argument as an attack, or prematurely apologize for attacking, it will be felt as an attack. If you just present it with kindness, people will realize you mean no harm. If you present it with a detached, professional “objectivity,” and actually feel [I just care about the truth], then… well, some people would still react badly, but it should usually be fine. It could be done with a bit more finesse, maybe.
There’s also a fourth claim: this is the right frame for people who read LW to take into debates with other people who read LW. Which I also agree with.
[I’m probably reading into Zach’s writing stuff that he didn’t intend to imply. But death of the author; I’m following the advice of the post]
It could be a mistake, or it could be an attempt to leave room for plausible deniability. There is something in his category war that doesn’t quite add up. I don’t know what the solution is; one of my main hypotheses is, as you say, that he is making some extremely large mistakes (not just wrt managing people’s emotions, but also object-level mistakes), but another is that cancel culture would punish him too much if he engaged in fully transparent discourse, and therefore some things that look like mistakes are actually intentional, meant to obscure what’s going on.
Also, to some extent it’s just correct: if people are emotionally manipulating him, he has to drop out of managing their emotions.
Managing your own emotions is clearly a prerequisite to successful epistemic rationality practices, but other people’s emotions? That seems straightforwardly irrelevant.
What do you see as the prototypical problem in epistemic rationality? I see it as creating an environment of collaborative truth-seeking, and there managing others’ emotions is perfectly relevant.
You need to solve epistemology, and you need an epistemology to solve epistemology.