I understand that. I cited a Sequences post that has the word “lies” in the title, but I’m claiming that the mechanism described in the cited posts—that distortions on one topic can spread to both adjacent topics, and to people’s understanding of what reasoning looks like—can apply more generally to distortions that aren’t direct lies.
Omitting information can be a distortion when the information would otherwise be relevant. In “A Rational Argument”, Yudkowsky gives the example of an election campaign manager who publishes survey responses from their candidate but omits the one question whose answer would make the candidate look bad, which Yudkowsky describes as “cross[ing] the line between rationality and rationalization” (!). This is a very high standard—but what made the Sequences so valuable is that they taught people the counterintuitive idea that this standard exists. I think there’s a lot of value in aspiring to hold one’s public reasoning to that standard.
Not infinite value, of course! If I knew for a fact that Godzilla would destroy the world if I cited a book that I otherwise would have cited as genuinely relevant, then fine, for the sake of the world, I can not cite the book.
Maybe we just quantitatively disagree on how tough Godzilla is and how large the costs of distortions are? Maybe you’re happy to throw Sargon of Akkad under the bus, but when Steve Hsu is getting thrown under the bus, I think that’s a serious problem for the future of humanity. I think this is actually worth a fight.
With my own resources and my own name (and a pen name), I’m fighting. If someone else doesn’t want to fight with their name and their resources, I’m happy to listen to suggestions for how people with different risk tolerances can cooperate to not step on each other’s toes! In the case of the shared resource of this website, if the Frontpage/Personal distinction isn’t strong enough, then sure, “This is on our Banned Topics list; take it to /r/TheMotte, you guys” could be another point on the compromise curve. What I would hope for from the people playing the sneaky consequentialist image-management strategy is that you guys would at least acknowledge that there is a conflict and that you’ve chosen a side.
For more on why I think not-making-false-claims is vastly too low of a standard to aim for, see “Firming Up Not-Lying Around Its Edge-Cases Is Less Broadly Useful Than One Might Initially Think” and “Heads I Win, Tails?—Never Heard of Her”.
Your posts seem to be about what happens if you filter out considerations that don’t go your way. Obviously, yes, that way you can get distortion without saying anything false. But the proposal here is to avoid certain topics and be fully honest about which topics are being avoided. This doesn’t create even a single bit of distortion. A blank canvas is not a distorted map. People can get their maps elsewhere, as they already do on many subjects, and as they will keep having to do regardless, simply because some filtering is inevitable beneath the eye of Sauron. (Distortions caused by misestimation of filtering are going to exist whether the filter has 40% strength or 30% strength. The way to minimize them is to focus on estimating correctly. A 100% strength filter is actually relatively easy to correctly estimate. And having the appearance of a forthright debate creates perverse incentives for people to distort their beliefs so they can have something inoffensive to be forthright about.)
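To put the misestimation point in more concrete terms, here is a toy model (entirely my own construction, not something from your posts): signals unfavorable to some position get suppressed with probability f before publication, and a reader corrects for an assumed filter strength f_est. All the names and numbers here (P_TRUE, observed_fraction, reader_estimate) are illustrative assumptions, not claims about any real forum.

```python
# Toy model (my construction, not from the cited posts): distortion comes
# from MISestimating the filter, not from the filter's strength per se.
#
# The world emits "favorable" signals with true frequency P_TRUE and
# unfavorable ones otherwise. A one-sided filter suppresses each
# unfavorable signal with probability f, so the favorable fraction a
# reader observes is q = p / (p + (1 - p) * (1 - f)). A reader who
# assumes the filter strength is f_est inverts that formula to recover p.

P_TRUE = 0.5  # true favorable frequency (illustrative)

def observed_fraction(p: float, f: float) -> float:
    """Favorable fraction among reports that survive a filter of strength f."""
    return p / (p + (1 - p) * (1 - f))

def reader_estimate(q: float, f_est: float) -> float:
    """Exact inverse of observed_fraction, using the reader's assumed f_est."""
    return q * (1 - f_est) / (1 - q * f_est)

for f, f_est in [(0.3, 0.3), (0.4, 0.4), (0.3, 0.0), (0.4, 0.3)]:
    q = observed_fraction(P_TRUE, f)
    print(f"filter={f:.1f}, assumed={f_est:.1f} -> "
          f"reader concludes p = {reader_estimate(q, f_est):.3f} "
          f"(truth: {P_TRUE})")

# filter=0.3, assumed=0.3 -> p = 0.500  (a correctly-estimated filter is harmless,
# filter=0.4, assumed=0.4 -> p = 0.500   regardless of its strength)
# filter=0.3, assumed=0.0 -> p = 0.588  (misestimation is what distorts)
# filter=0.4, assumed=0.3 -> p = 0.538
# At f = 1.0 the observed fraction is 1 for every p: the channel carries no
# signal at all, the reader knows it, and they route around the blank canvas.
```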
The people going after Steve Hsu almost entirely don’t care whether LW hosts Bell Curve reviews. If adjusting allowable topic space gets us 1 util and causes 2 utils of damage distributed evenly across 99 Sargons and one Steve Hsu, that’s only 0.02 Hsu utils lost, which seems like a good trade.
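A minimal sketch of that arithmetic, using the hypothetical figures above (the utils are illustrative stand-ins, not a real cost model):

```python
# Spelling out the back-of-the-envelope trade above. The util figures are
# the comment's hypothetical numbers, not measurements of anything real.
gain = 1.0           # utils gained by adjusting allowable topic space
total_damage = 2.0   # utils of collateral damage, shared evenly...
victims = 99 + 1     # ...across 99 Sargons and one Steve Hsu

hsu_damage = total_damage / victims
print(f"damage landing on Hsu: {hsu_damage:.2f} utils")        # 0.02
print(f"gain vs. Hsu-specific cost: {gain:.2f} vs. {hsu_damage:.2f}")
```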
I don’t have a lot of verbal energy, I find the “competing grandstanding walls of text” style of discussion draining, and my arguments don’t seem to be landing for some reason, so I’m on the verge of tapping out. Generating and posting an IM chat log could be a lot more productive, but people all seem pretty set in their opinions, so it could just be a waste of energy.