I don’t think many people hold the view you’re attacking. People are very aware of the risks of self-deception. There’s a reason it’s considered a part of the so-called Dark Arts.
Cheers for the comment. I think I perhaps should have made the self-deception in the river example less deliberate, to create a stronger link between it and the “winning” mentality. I guess I’m suggesting that a little self-deception is inevitably incurred by the “systematised winning” and general “truth isn’t everything” attitudes that I’ve run into so far in my LW experience. Several people have straight-up told me that truth is only incidental in the common LWer’s approach to instrumental rationality, though I can see there is a range of views.
The truth is indeed only incidental, pretty much by the definition of instrumental rationality, when truth isn’t your terminal goal. But surely the vast majority agree that the truth is highly instrumentally valuable for almost all well-behaved goals? Finding out the truth is pretty much a textbook example of an instrumental goal that very diverse intelligences would converge on.
I guess my argument is that when people can’t see an immediate use for the truth, they can become lazy or rationalise that a self-deception is acceptable. This happens because truth is seen as merely useful rather than essential, or at least essential in all but the most extreme circumstances. I think this attitude is present in the “truth isn’t everything” interpretation of instrumental rationality. “Systematised winning” isn’t intended to endorse this kind of interpretation, but I think the words it uses evoke too much that’s tied to a problematic engagement with the truth. That’s where I currently sit on the topic, in any case.