I have often been bothered by that norm myself, especially on Less Wrong, but it's not clear what you're proposing to put in its place. Given that human beings are nowhere near the ideal reasoners Aumann's agreement theorem applies to, you cannot expect that stating something far from what other people think will produce any sudden change in their probability estimates. At best, they will simply ignore you.
If you're simply saying that people should assume you have reasons, they probably do assume that. But if you say something they think is wrong, they will just assume your reasons are bad ones. It is not clear how you could prevent them from doing that, since you probably do the same thing to them.