because it is branded with the ea movement by being lesswrong.com. it cannot be unbranded except by changing the associations people actually make. its true position in the latent network of relationships online makes it associated. you may not be aware of your position in a larger organism, but that doesn't mean you aren't in one just because you only want to focus on the contents of your own cell; if you insist on not thinking about the larger organisms you participate in, that's alright, but it makes you a skin cell, not a nerve cell.
edit: I suppose a basic underlying viewpoint I have is that all signaling is done by taking actions, and the only actions worth taking are ones that send signals into the universe that shape the universe towards the forms you wish it to have. lifting something off the ground is signaling, and signaling is measured in watts. false signals are lying; don't do those, they're worse than useless. putting map signals into another brain that do not match the signals you're sending into the territory is dishonesty, and the false signals themselves are the thing in question, which needs to be repaired into honesty by example.
because it is branded with the ea movement by being lesswrong.com.
What does the name "lesswrong" have to do with EA? There's a certain overlap between the two communities, but LessWrong's mission has nothing to do with EA specifically. To the extent that it has any mission other than the one on its face, raising the sanity waterline, that mission has historically been Eliezer's: to get people to think properly about AI and avert the coming doom.
FWIW, I am not and never have been an EA and do not read or participate in EA forums, but I've been on LW since it began on OvercomingBias. If it became "an enforcement arm of the branded 'EA' movement, accountable for all its sins", I would leave.