Is it possible to say ‘your actions are bad and maybe you should stop’ or even ‘your actions are having these results and maybe you should stop’ without saying ‘you are bad and you should feel bad’?
I actually am asking, because I don’t know.
I’ve touched on this elsethread, but my actual answer is that if you want to do that, you either need to create a dedicated space of trust that people have bought into, or you need to continuously invest effort in it. And yes, that sucks. It’s hugely inefficient. But I don’t actually see alternatives.
It sucks even more because it’s probably anti-inductive: as some phrases become commonly understood, they later become carrier waves for subtle barbs and political manipulations. (I’m not confident how common this is. I think a more prototypical example is “southern politeness,” as with “Oh bless your heart.”)
So I don’t think there’s a permanent answer for public discourse. There’s just costly signaling via phrasing things carefully in a way that suggests you’re paying attention to your reader’s mental state (including their mental map of the current landscape of social moves people commonly pull) and writing things that expressly work to build trust given that mental state.
(Duncan’s more recent writing often seems to be making an effort at this. It doesn’t work universally, due to the unfortunate fact that not all of one’s readers will share the same mental state. A disclaimer that reassures one person may alienate another.)
It seems… hypothetically possible for LessWrong to someday establish this sort of trust, but I think it actually requires hours and hours of doublecrux for each pair of people with different worldviews, and then that trust isn’t necessarily transitive to the next pair of people with different worldviews. (Worldviews which affect what even seem like reasonable meta-level norms within the paradigm of ‘we’re all here to truthseek’. See tensions in truthseeking for some [possibly out of date] thoughts of mine on that.)
I’ve noted issues with Public Archipelago given current technologies, but it still seems like the best solution to me.
It seems pretty fucked up to take positive proposals at face value given that context.