Once that happened, I’d no longer be able to eat chickens. I could apply the same process to all animals, and so by induction I would be unwilling to eat any animal.
This is an interesting way to look at using induction, but I see it more as a willing reprogramming of your brain. In your case, you were able to simulate a case where eating chicken would disgust you (eating a pet) and that gave you impetus to stop eating chicken.
I am a big meat eater. I predict there is a 30–60% chance I would drastically reduce my meat eating if I were forced to run a slaughterhouse for my food, seeing the suffering and killing the animals myself. Every time I wanted meat I’d need to take on the moral burden of killing an animal. If I may try my hand at some pop-historical analysis, I bet this is why past societies frequently held a reverence, often spiritual, for the killing of animals for food.
...And yet, I still eat lots of meat. Probably if someone took me on a tour of kids with malaria in Africa I’d donate more to those charities. Or if I was walked through a Russian sex trafficking brothel, I would support organizations to end those practices. Or if someone made a three hour movie on the tragedy of the homeless person who sleeps by my apartment, documenting their misfortune, I would go out and buy a coat and food and try to help them because it would unlock and develop emotions I don’t currently have.
I sort of know that if I went through these simulations, they would change my outlook and behavior in life. These are obviously topics I am already familiar with, but there are surely many topics I’m unfamiliar with that would likewise change my view of the world. Of course I can’t have all these experiences, and I’m not sure how I should adjust my behavior today based on expectations of how my behavior would change if I had experiences that I’m not going to have, but plausibly could.
Is it rational for me to eat less meat now, even though I enjoy it and don’t feel guilty, because a plausible counterfactual me who had experiences I don’t have would tell me to? Or is it rational for me to eat meat because no counterfactual me actually exists, and as it stands now I enjoy it and don’t feel guilty?
Assuming for the sake of argument that a counterfactual me with a pet chicken would become emotionally attached to it and be unwilling to eat chickens, that still leaves open the question of whether the counterfactual me is acting rationally. Perhaps I’m just recognizing that a counterfactual me is vulnerable to having his thought processes hacked.
I could equally well say “a counterfactual me who was kidnapped at birth and raised by Christians would grow up to be a Christian. So I should be a Christian now.” Or even “a counterfactual me who joined Scientology out of curiosity would be overcome by Scientology’s conditioning and come to actually believe in Scientology, so I should believe in Scientology now.”
We are all finite beings, so “rational” must mean a finite approximation of rationality. Although there is a question of how far to apply systematic rationality to your thinking, that question is itself not fully answerable by systematic rationality (although you could produce a finite approximation of an answer). What is at stake here is not whether your thinking is rational so much as how much effort you want to expend on coming up with an answer, what things you are willing to count as evidence (counterfactual selves, for example), and how you will weigh them.