In general, I would agree with the above statement (and technically speaking, I have made such trade-offs). But I do want to point out that it’s important to consider what the loss of knowledge/epistemics entails. This is because certain epistemic sacrifices have minimal costs (I’m very confident that giving up FDT for CDT for the next 24 hours won’t affect me at all), while others have unbounded costs (if giving up materialism causes me to abandon cryonics, it’s hard to quantify how large of a blunder that would be). This is especially true of epistemics that allow you to be unboundedly exploited by an adversarial agent.
As a result, even when the expected value looks positive to me, I’ll still try to avoid these kinds of trade-offs, because certain black swans (e.g. bumping into an adversarial agent that exploits your lack of knowledge about something) make such bets very high risk.
This sounds pretty reasonable to me; it sounds like you’re basically trying to maximize expected value, but don’t always trust your initial intuitions, which seems quite reasonable.