You may draw what conclusions you like! It’s not my intention to defend EA here.
Here’s an attempt to clarify my outlook, though my words might not succeed:
To the extent EA builds up idealized molds to shove people into to extract value from them, this is fucked up. To the extent that EA then pretends people like Sam or others in power fit the same mold, this is extra fucked up. Both these things look to me to be rampant in EA. I don't like it.
That does clarify where you're coming from. I made my comment because it seems to me that it would be a shame for people to fall into one of the more obvious attractors for reasoning within EA about the SBF situation. E.g., an attractor labelled something like "SBF's actions were not part of EA because EA doesn't do those Bad Things".
Which is basically on the greatest hits list for how (not necessarily centrally unified) groups of humans have defended themselves from losing cohesion over the actions of a subset anytime in recorded history. Some portion of the reasoning on SBF in the past week looks motivated in service of the above.
The following isn’t really pointed at you, just my thoughts on the situation.
I think there's nearly unavoidable tension in trying to float arguments that deal with the optics of SBF's connection to EA from within EA. Which is a thing that is explicitly happening in this thread. Standards of epistemic honesty are in conflict with the group's need to hold together. While the truth of the matter is and may remain uncertain, if SBF's fraud was motivated wholly or in part by EA principles, that connection should be taken seriously.
My personal opinion is that, the more I think about it, the more obvious it seems that several cultural features of LW-adjacent EA are nearly ideal for generating extremist behavior. People are forming consensus thought groups around moral calculations that explicitly marginalize the value of all living people, to say nothing of the extreme side of negative consequentialism. This is all in an overall environment of iconoclasm and disregard for established norms in favor of taking new ideas to their logical conclusions.
These are being held in an equilibrium by stabilizing norms. At the risk of stating the obvious, insofar as the group in question is a group at all, it is heterogeneous; the cultural features I’m talking about are also some of the unique positive values of EA. But these memes have sharp edges.