Quick responses:
In my head I’m making a much more specific claim than ‘mental fortitude’. A person can handle arguments and ideas well or poorly depending on their source: you can handle criticism from your manager but not from your child, or you can sensibly evaluate arguments given to you in person at a whiteboard but not in a fancy physics paper. I didn’t mean “bad at evaluating arguments in general”, I meant bad at evaluating them from one specific source. That said, I will consider your argument that I should be careful about criticizing someone’s reasoning abilities… actually wait, I’m not sure I get it. Maybe there should be a high bar for doing this even on LW; I agree that in most other places there is a high bar. I will aim to think more on it.
I’m not dismissing the position as nonsensical, and I’d be happy to engage with it if a LWer brought it up as their own position. What I said is that the issue is not being able to see the perspective in which it’s nonsense that bullies made up to paint you in a bad light. I think a pretty plausible story is “Huh, this revered EA person seems to have pretty aggressive and self-serving opinions about finance and power and crypto, as many corrupt people in finance probably do, and the proposal that this is a front intended to somehow affect EA’s reputation (as if his front would really have much effect on whether EA turned out to have been led by one of the biggest fraudsters in history) is pretty silly”, and that’s definitely one of my main perspectives. My point isn’t that it’s bad to consider other opinions; my point is that it’s an issue to not be able to come up with something like this.
I currently put >50% on my read being right, but I may have mistakenly read things into Ilverin’s comment; I definitely put >25% on that.
I intended to bring it up as plausible, but not to explicitly say that I thought it was p>0.5 (because it wasn’t a firm belief and I didn’t want others to do any Bayesian update on it). I wanted to read arguments about its plausibility. (Some pretty convincing arguments are SBF’s high level of luxury consumption, and that he potentially took all of the Alameda shares from Alameda’s EA cofounder, Tara Mac Aulay.)
If it is plausible, even if it isn’t p>0.5, then it’s possible SBF wasn’t selfish, in which case that’s a reason for EA to focus more on inculcating philosophy in its members (whether the answer is “naive utilitarianism is wrong, use rule utilitarianism/virtue ethics/deontology” or “naive utilitarianism almost never advocates fraud”, etc.). Some old and new preventive measures, like EA Forum posts, do exist; maybe that’s enough, or maybe not.