Right; that’s what happens by default. But if you find that, because your future self will want to keep its new values, you’re overly reluctant to take useful actions that change your values as a side effect, you might want to precommit to rolling back certain changes; or, if you can’t keep track of all the side effects, it’s conceivable you’d want to turn this into a general habit. I could see this being either a good or a bad idea on net.
I don’t think you can do this. Your future self, not sharing your values, will have no reason to honor your present self’s precommitment.
Precommitment implies making it expensive or impossible for your future self not to honor your commitment.
Errr, how? I am familiar with the practice of precommitment, but most of the ways of creating one for oneself seem to rely on consequences not preferred by one’s values. If one’s values have changed, such a precommitment isn’t very helpful.
In the context of the thread we’re not talking about all your values changing, just some subset. Base the precommitment around a value you do not expect to change. Money is a reliable fallback due to its fungibility.
This isn’t as reliable as you think. It isn’t often that people change how much importance they attach to money, but it isn’t rare, either. Either way, is there a good way to guarantee that you’ll lose access to money when your values change? That’s tough for an external party to verify when you have an incentive to lie.
This is more reliable than you think. We live in a world where money can be converted into furthering a very wide range of values.
It doesn’t have to be money. You just need a value that you have no reason to expect will change significantly as a result of exposure to particular ‘dangerous thoughts’. Can you honestly say, though, that you expect exposing yourself to information about sex differences in intelligence to radically alter the relative value of money to you?
Escrow is the general name for a good way to guarantee that your future self will be bound by your precommitment. Depending on how much money is involved, this could be as informal as asking a trusted friend who shares your current values to hold some money for a specified period and to promise to donate it to a charity promoting the value you fear may be at risk, should they judge you to have abandoned that value.
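To make the shape of that arrangement concrete, here is a minimal sketch in Python. Every name in it (EscrowAgreement, the values_intact judgment callback, the amounts and dates) is hypothetical, invented purely for illustration; a real version of this is a social agreement with a trusted friend, not a program.

```python
from dataclasses import dataclass
from datetime import date
from typing import Callable

# Toy model of the informal escrow arrangement described above.
# All names and values here are hypothetical illustrations.

@dataclass
class EscrowAgreement:
    stake: float                       # money handed to the trusted friend
    release_date: date                 # end of the specified period
    charity: str                       # promotes the value feared to be at risk
    values_intact: Callable[[], bool]  # the friend's judgment call

    def settle(self, today: date) -> str:
        """Decide where the stake goes once the period has elapsed."""
        if today < self.release_date:
            return "stake stays with the escrow holder"
        if self.values_intact():
            return f"return {self.stake:.2f} to the precommitter"
        return f"donate {self.stake:.2f} to {self.charity}"

# Usage: the friend, sharing your current values, makes the judgment.
agreement = EscrowAgreement(
    stake=500.0,
    release_date=date(2012, 1, 1),
    charity="a charity promoting the at-risk value",
    values_intact=lambda: True,  # the friend's honest assessment
)
print(agreement.settle(date(2012, 6, 1)))  # -> stake is returned
```

The key design point is that the judgment call sits with a third party who shares your current values, which is what makes the commitment credible against a future self with an incentive to lie.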
The whole point of precommitment is that you have leverage over your future self. You can make arrangements of cost and complexity up to the limit of how much your current self values the matter of concern, and impose a much greater penalty on your future self in case of breach of contract.
Ultimately, I don’t believe this is your true rejection. If you wished, you could find ways to make credible precommitments to your current values and then undergo controlled exposure to ‘dangerous thoughts’, but you choose not to. That may be a valid choice from a cost/benefit analysis by your current values, but it is not because the alternative is impossible; it is just too expensive for your tastes.