In general, I don’t fully agree with rationalist culture about what is demanded by honesty. That Leverage example doesn’t sound obviously bad to me: maybe they just don’t want to promote Leverage, or confuse anyone about their position on Leverage, rather than creating a historical record, which you seem to take to be the only legitimate goal? (Unless you mean the most recent EA Global, in which case that would seem more like a cover-up.)
The advantage of pre-commitment virtue signals is that you don’t have to interpret them through the lens of your values to know whether the person fulfilled them or not. Most virtue signals, though, depend on whether you agree that the thing is a virtue, and when the virtue in question is a very specific flavor of something like honesty, that becomes ingroup-vs-neargroup-defining.
Honesty isn’t just a virtue. When it comes to trusting people, signals of honesty mean that you can take what someone is saying at face value; they let you trust people not to mislead you. This is why focusing on whether signals are virtuous can be misleading when you want to make decisions about whom to trust.
Editing pictures that you publish on your own website to remove uncomfortable information is worse than just not speaking about certain information. It would be possible to simply not publish the photo. Deciding to edit it to remove information is a conscious choice, and that choice is a signal.
I don’t know the full situation or what I would conclude about it, but I don’t think your interpretation is QED on its face. Like I said, I feel like it is potentially more dishonest or misleading to seem to endorse Leverage. I don’t know why they didn’t just not post the pictures at all, which seems the least potentially confusing or deceptive, but the fact that they didn’t doesn’t lead me to conclude dishonesty without knowing more.
I actually think LWers tend toward the bad kind of virtue signaling with honesty, and they tend to define honesty as not doing themselves any favors with communication. (Makes sense considering Hanson’s foundational influence.)