None, if you buy the “we just have a curious itch to understand the most irrelevant orthodoxy you can think of” explanation. But if that’s a valid reason for rationalists to dig into things that are taboo because of their harmful consequences, is it not then also valid for Cade Metz to follow a curious itch to dig into rationalists’ private information?
Well, I don’t understand what that position has to do with doxxing someone. What does obsessively pointing out how a reigning orthodoxy is incorrect have to do with revealing someone’s private info and making it hard for them to do their job? The former is socially useful because a lot of orthodoxies result in bad policies or cause people to err in their private lives or whatever. The latter mostly isn’t.
Yes, sometimes the two coincide, e.g. revealing that the Church uses heliocentric models to calculate celestial movements, or Watergate, or whatever. But that’s quite rare, and I note Metz didn’t provide any argument that doxxing Scott is like one of those cases.
Consider a counterfactual where Scott, in his private life, was crusading against DEI policies in a visible way. Then people benefiting from those policies may want to know that “there’s this political activist who’s advocating for policies that harm you, and the scope of his influence is way bigger than you thought”. Which would clearly be useful info for a decent chunk of readers. Knowing his name would be useful!
Instead, it’s just “we gotta say his name. It’s so obvious, you know?” OK. So what? Who does that help? Why’s the knowledge valuable? I have not seen a good answer to those questions. Or consider: if Metz for some bizarre reason decided to figure out who “Algon” on LW is and wrote an article revealing that I’m X because “it’s true”, I’d say that’s a waste of people’s time and a bit of a dick move.
Yes, he should still be allowed to do so, because regulating free speech well is hard and I’d rather eat the costs than deal with poor regulations. Doesn’t change the dickishness of it.
The former is socially useful because a lot of orthodoxies result in bad policies or cause people to err in their private lives or whatever. The latter mostly isn’t.
I think once you get concrete about it in the discourse, this basically translates to “supports racist and sexist policies”, albeit from the perspective of those who are pro these policies.
Let’s take autogynephilia theory as an example of a taboo belief, both because it’s something I’m familiar with and because it’s something Scott Alexander has spoken out against, so we’re not putting Scott Alexander in any uncomfortable position about it.
Autogynephilia theory has become taboo for various reasons. Some people argue that they should still disseminate it because it’s true, even if it doesn’t have any particular policy implications, but of course that leads to paradoxes: those people themselves tend to have privacy and reputation concerns, and they’re not happy about having true things about themselves shared publicly.
The alternate argument is on the basis of “a lot of orthodoxies result in bad policies or cause people to err in their private lives or whatever”, but when you get specific about what that means rather than dismissing it as “or whatever”, it’s something like “autogynephilia theory is important to know so autogynephiles don’t end up thinking they’re trans and transitioning”, which in other language would mean something like “most trans women shouldn’t have transitioned, and we need policies to ensure that fewer transition in the future”. Which is generally considered an anti-trans position!
Now you might say, well, that position is a good position. But that’s a spicy argument to make, so a lot of the time people fall back on “autogynephilia theory is true and we should have a strong presumption in favor of saying true things”.
Now, there’s also the whole “the discourse is not real life” issue, where the people who advocate for some belief might not be representative of the applications of that belief.
I think once you get concrete about it in the discourse, this basically translates to “supports racist and sexist policies”, albeit from the perspective of those who are pro these policies.
That seems basically correct? And also fine. If you think lots of people are making mistakes that will hurt themselves/others/you and you can convince people about this by sharing info, that’s basically fine to me.
I still don’t understand what this has to do with doxxing someone. I suspect we’re talking past each other right now.
but of course that leads to paradoxes: those people themselves tend to have privacy and reputation concerns, and they’re not happy about having true things about themselves shared publicly.
What paradoxes, which people, which things? This isn’t a gotcha: I’m just struggling to parse this sentence right now. I can’t think of any concrete examples that fit. Maybe something like “there are autogynephiles who claim to be trans but aren’t really, and they’d be unhappy if this fact was shared because it would harm their reputation”? If that were true, and someone discovered a specific autogynephile who thinks they’re not really trans but presents as such and outed them, I would call that a dick move.
So I’m not sure what the paradox is. One stab at a potential paradox: if you spread the hypothetically true info that 99.99% of trans women are autogynephiles, then a rational agent would conclude that any particular trans woman is really a cis autogynephile. Which means you’re basically doxxing them by providing info that would, in this world, be relevant to society making decisions about stuff like who’s allowed to compete in women’s sports.
I guess this is true, but it also seems like an extreme case to me. Most people aren’t that rational and, depending on the society, are willing to believe others about kinda-unlikely things about themselves. So in a less extreme hypothetical, say 90% instead of 99.99%, I can see people believing that most supposedly trans women aren’t trans, but still believing any specific person who claims they’re a trans woman.
EDIT: I believe that a significant fraction of conflicts aren’t mostly mistakes. But even there, the costs of attempts to restrict speech are quite high.
That seems basically correct? And also fine. If you think lots of people are making mistakes that will hurt themselves/others/you and you can convince people about this by sharing info, that’s basically fine to me.
I still don’t understand what this has to do with doxxing someone. I suspect we’re talking past each other right now.
I mean insofar as people insist they’re interested in it for political reasons, it makes sense to distinguish this from the doxxing and say that there’s no legitimate political use for Scott Alexander’s name.
The trouble is that often people de-emphasize their political motivations, as Scott Alexander did when he framed it around the most irrelevant orthodoxy you can think of, one that you’re simply interested in out of a curious itch. The most plausible motivation I can think of for making this frame is to avoid being associated with the political motivation.
But regardless of whether that explanation is true, if one says that there’s a strong presumption in favor of sharing truth, strong enough to cover people who dig into inconsequential dogmas that are taboo to question because they cover up moral contradictions that people are afraid will cause genocidal harm if unleashed, then it sure seems like this strong presumption in favor of truth also legitimizes mild cases of doxxing.

Michael Bailey tends to insist that it’s bad to speculate about hidden motives that scientists like him might have for their research, yet when he explains his own research, he insists that he should study people’s hidden motivations using only the justification of truth and curiosity.
OK, now I understand the connection to doxxing much more clearly. Thank you. To be clear, I do not endorse turning a no-doxxing rule into law.
I still disagree, because it didn’t look like Metz had any reason to doxx Scott beyond “just because”. There were no big benefits to readers, nor any story about why there was no harm done to Scott in spite of his protests.
Whereas if I’m a journalist and encounter someone who says “if you release information about genetic differences in intelligence, that will cause a genocide”, I can give reasons why that is unlikely. And I can give reasons why the associated common bundle of beliefs and values, i.e. the orthodoxy, is not inconsequential: that there are likely large (albeit not genocide-large) harms that this orthodoxy is causing.
I mean, I’m not arguing Cade Metz should have doxxed Scott Alexander, I’m just arguing that there is a tension between common rationalist ideology that one should have a strong presumption in favor of telling the truth, and the position that Cade Metz shouldn’t have doxxed Scott Alexander. As far as I can tell, this common rationalist ideology was a cover for spicier views that you have no issue admitting to, so I’m not exactly saying that there’s any contradiction in your vibe. More that there’s a contradiction in Scott Alexander’s (at least at the time of writing Kolmogorov Complicity).
I’m not sure what my own resolution to the paradox/contradiction is. Maybe the root problem is that people create information to bolster their side in political discourse, rather than to inform their ideology about how to address problems that they care about. In the latter case, creating information does real productive work, but in the former case, information mostly turns into a weapon, which incentivizes creating some of the most cursed pieces of information known to the world.
I’m just arguing that there is a tension between common rationalist ideology that one should have a strong presumption in favor of telling the truth, and the position that Cade Metz shouldn’t have doxxed Scott Alexander.
His doxxing of Scott was in an article that also contained lies, lies which made the doxxing more harmful. He wouldn’t have just posted Scott’s real name in a context where no lies were involved.
Your argument rests on a false dichotomy. There are definitely other options than ‘wanting to know truth for no reason at all’ and ‘wanting to know truth to support racist policies’. It is at least plausibly the case that beneficial, non-discriminatory policies could result from knowledge currently considered taboo. It could at least be relevant to other things and therefore useful to know!
What plausible benefit is there to knowing Scott’s real name? What could it be relevant to?
People do sometimes make the case that knowing more information about sex and race differences can be helpful for women and black people. It’s a fine case to make, if one can actually make it work out in practice. My point is just that the other two approaches also exist.