fwiw I think gears’ comment is sort of directionally right.
I think there is something important Oli and John were right to defend on LessWrong: epistemic culture, keeping things from drifting toward higher simulacra levels, and so on. But, also, the LessWrong-style way of doing things feels kinda stunted at coordination.
There’s sort of a package deal that (much of) society offers on how to do moral coordination (see: “Simulacrum 3 As Stag-Hunt Strategy”), which has a lot of problems: epistemically, strategically, and morally. My sense is that LW-er types are often trying to roll their own coordination schemes, and this will (hopefully) eventually result in something better and more epistemically/lawfully grounded. But in the meantime it means there are a lot of obvious tools we don’t have access to (including “interfacing morally/coordinationally with much of the rest of the world,” which is one of the key points of, well, morality and coordination).
I endorse making the overall tradeoff, but it seems like it should come with more awareness that we are, in fact, making a tradeoff by having our memetic immune system trigger this hard, not just uniformly choosing a better option.
...
Followup note: there’s a distinction between “what is right for LessWrong” and “what is right for the broader rationalsphere on the EA Forum, Twitter, and so on.” I think Oli criticized both, separately. LessWrong optimizes especially hard for epistemics and intellectual progress, and I think that’s correct. It’s less obvious to me whether it’s bad that this post got 600 karma on the EA Forum. In my dream world, the whole EAcosystem has better coordination and/or morality tech that doesn’t route through Simulacrum 3 signaling games, which are vulnerable to co-option. But I think we’re still in an uncanny valley of coordination theory/practice, and in the meantime I’m not sure what the right approach is for non-LW discourse.
I don’t actually have a strong belief that the OP is good at accomplishing its goal. It’s just that the knee-jerk reaction to it feels to me like it has a missing mood.
(Upvote-disagree.)