...What’s the big deal? Don’t precommit to make poor decisions, especially those which leave you vulnerable to acausal (or any) blackmail. And precommit to cancel your precommitment if you learn that it is harmful.
I don’t believe it could work that way. If you don’t precommit when you could have, your next observational moment is more likely to be one of extreme suffering than not. It is rational to precommit, if you can; that’s the whole issue. You are applying common sense to game theory. You cannot suddenly start choosing which consequences of your model of rationality you accept based on hidden intuitions. If you care to explain your views further in light of TDT or related theories, this could be a fruitful discussion (at least for me).
If my response to the situation you described is to precommit to whatever the blackmailer wants, that is exactly what makes the blackmailer want to blackmail me in the first place. If every simulation of me shrugs and flips my blackmailer the bird, the blackmailer has no incentive to punish me. You can escape punishment if you are prepared to flip the bird and accept the consequences, rather than incentivizing the threat in the first place. There may be blackmailers who enforce their ultimatums whether or not you will respond to them, but in that case akrasia doesn’t help either.
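The argument above can be sketched as a toy decision problem. This is only an illustrative model with made-up payoffs, not anyone's formal decision theory: the blackmailer simulates the victim first and issues the threat only if the simulation predicts compliance, so a disposition to refuse removes the incentive entirely.

```python
# Toy model of the blackmail game described above.
# All payoffs are hypothetical; the point is the sign of the
# blackmailer's expected value, not the particular numbers.

def victim_complies(disposition: str) -> bool:
    """A victim disposed to refuse never pays, in every simulation."""
    return disposition == "compliant"

def blackmailer_threatens(disposition: str,
                          payoff_if_paid: float = 10.0,
                          cost_of_punishing: float = -1.0) -> bool:
    """The blackmailer threatens only when threatening has positive
    expected value, i.e. when the simulated victim would pay up."""
    expected = payoff_if_paid if victim_complies(disposition) else cost_of_punishing
    return expected > 0

print(blackmailer_threatens("compliant"))  # True: compliance invites the threat
print(blackmailer_threatens("refuses"))    # False: refusal removes the incentive
```

On this (simplified) model, the disposition itself does the work: the threat never materializes against an agent known to refuse, which is the "flip the bird" point above.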
And yet we do not experience this. Something is wrong with this thesis.
Yes, because we can’t precommit. That’s one of the points of my post. But there might be other reasons too, I would assume. Nevertheless, it still seems to me that precommitments would make this scenario more likely, and that this is undesirable.