One of the nice things in my work is that I can just point it out when I think something human is getting in the way. Like, sometimes someone says an idea is a bad idea. If I dig in, sometimes there’s a human reason they say that: they don’t actually think it’s a bad idea, they just don’t think they’ll like doing the work to make the idea real, or something similar. Those are different things, and it’s important to have a conversation to sort that out so we can move forward on two separate questions: is the idea good, and why don’t you want to be involved with it?
But in online conversations, especially on LW, people often feel like it’s rude to go after someone’s humanness. If you disagree with an idea, it’s only normatively acceptable to talk about the idea, not about your motivations for disagreeing. Yes, this norm comes from standards meant to separate ideas from people, and it’s generally useful, but sometimes it gets in the way and covers up the real reason for a disagreement.
For example, maybe someone suggests we should have prediction markets for everything. You say that sounds terrible. But really you had a personal experience with prediction markets: someone posted the question “will you break up with your romantic partner?”, everyone bet “yes”, and then it came true. Now you have it out for prediction markets, but you don’t say that; you just have lots of reasons why prediction markets are a bad idea. If we only talk about your purported reasons, we’ll never get to the heart of the objection!
I think we make a mistake in talking about ideas while forgetting that it’s humans doing the talking. Separating ideas from people does some good: naively failing to separate them creates all kinds of problems, which is why we have this bit of social tech in place! But it can also go too far, and we need to find specific ways to let the humans back into idea discussions so we can address the sources of the ideas, not just the ideas themselves. This seems relevant to convincing others, uncovering the reasons for your own beliefs, and building consensus about what is true of the world.
An issue with sharing human stories is the juxtaposition between:
- Many people are, or must be, anonymous online.
- Sharing human stories is often self-doxing.
Do you have a human story about why sharing stories is self-doxing? I imagine most stories can be told in a way that doesn’t dox, especially if you change some details that are irrelevant to the crux.
Some stories can’t be. That being said, many can. I would give examples from my own experience on this site, but they are, uh, self-doxing.
“especially if you change some details that are irrelevant to the crux.”

Most of the issues arise either a) when the crucial details are themselves the details you have to hide (“How can you be an expert on X given that there are about a half-dozen people who know X?” is a classic, for instance), or b) when the story in isolation doesn’t leak enough bits of information to self-dox, but combined with other already-told (and hence irrevocable) stories it does.
(Remember, you only need ~33 bits of information to uniquely identify an individual[1]. That’s tiny.)
Although of course this can be more difficult in practice.
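As a rough illustration of the arithmetic behind that ~33-bit figure (a sketch; the population figure and the example attribute frequencies below are illustrative assumptions, not from the thread):

```python
import math

# ~33 bits suffice because 2^33 ≈ 8.6 billion exceeds the world population:
# singling out one person among N people takes log2(N) bits.
world_population = 8_000_000_000
print(f"{math.log2(world_population):.1f} bits")  # ~32.9

# Bits from (roughly) independent attributes add up: an attribute shared
# by 1 in N people contributes log2(N) bits. Hypothetical frequencies:
attributes = {
    "country of residence": 40,                  # 1 in ~40 people
    "profession": 1_000,                         # 1 in ~1,000
    "expertise in obscure field X": 1_000_000,   # 1 in ~1,000,000
}
total = sum(math.log2(n) for n in attributes.values())
print(f"{total:.1f} bits leaked by these three details alone")  # ~35.2
```

On these assumed frequencies, three innocuous-seeming details already cross the 33-bit threshold, which is the sense in which stories told across separate comments can combine to dox.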