One problem is that the ‘you’ that can be affected by things you expect to interact with in the future is in principle no different from those space colonists who are sent out. You can’t interact with future-you. All the decisions we make shape a future with which we don’t directly interact. Future-you is just the result of one more ‘default’ manufacturing process, where the laws of physics ensure that there is a physical structure very similar to the one that existed in the past. Hunger is a drive that makes you ‘manufacture’ a fed-future-you; compassion is a drive that makes you ‘manufacture’ a good-feeling-other-person; and so on.
I don’t see any essential difference between decisions that produce an ‘observable’ effect and those that produce an ‘invisible’ one. What makes you value some future states and not others is your makeup, the ‘thousand shards of desire’ as Eliezer put it, and among those shards there may well be some that assign value to physical states that never interact with the decision-maker’s body.
If I put a person in a black box, program the box to torture that person for 50 years, and then automatically destroy all evidence, so that no tortured-person state can ever be observed, isn’t that as ‘invisible’ as sending a photon away? I know the person is being tortured, just as I know the photon is flying away, but I can’t interact with either of them. And yet I assign a distinct negative value to the invisible-torture box. It’s one of the stronger drives built into me.