The context here is of an aspiring rationalist trying to consciously plan and follow a complete social strategy, and rejecting their more basic intuitions about how they should act in favor of their consequentialist calculus. This sort of conscious engineering often fails spectacularly, as I can attest. (The usual exceptions are heuristics that have been tested and passed on by others, and are more likely to succeed not because of their rational appeal relative to other suggestions but rather because of their optimization by selection.)
Then they are overreaching, using the tool incorrectly and confusing themselves instead of fixing the problems. Note that conscious planning is also mostly intuition, not expected utility maximization; you've merely highlighted the incoherence of applying the practice where doing so leads to failure while the goal is success.