TLDR: give pigs guns (preferably by enhancing individual baseline pigs, not by breeding a new type of smart, powerful pig. Otherwise they will probably just be treated as two different cases. More like gene therapy than producing modified fetuses.)
Lately I've held the opinion that morals are a proxy for negotiated cooperation, or something like it; I think this clarifies a lot about the dynamics that produce them. The analogy: evolutionary selection → the human desire to care about family and see their kids prosper; implicit coordination problems between agents of varied power levels → morals.
So uplift could be the best way to ensure that animals are treated well. Just give them the power to hurt you and benefit you, and they will be included in moral consideration, after some time for things to shake out. Same with hypothetical p-zombies: they are as powerful as humans, so they will be included. Same with EMs.
Also, "superbeneficiaries" are then just powerful beings; there's no need to research their depth of experience or strength of preferences. (E.g. gods, who can do whatever they want and don't abide by their own rules, yet are still perceived as moral, are an example of this dynamic.)
Also: a pantheon of more human-like gods → less perceived power + a perceived possibility of playing on their disagreements → lesser moral status. One all-powerful god → more perceived power → stronger moral status. Coincidence? I think not.
Modern morals could be driven by much stronger social mobility. People have a lot of power now, and can unexpectedly acquire a lot more later, so you should be careful with them and visibly commit to treating them well (i.e. be a moral person, with the particular appropriate type of morals).
And it's not surprising that (chattel) slaves were denied any claim to moral consideration (or any claim to personhood), in a strong equilibrium where they were powerless and expected to remain powerless.
I think this is misguided. It ignores the is-ought discrepancy by assuming that the way morals seem to have evolved is the “truth” of moral reasoning. I also think it’s tactically unsound—the most common human-group reaction to something that looks like a threat and isn’t already powerful enough to hurt us is extermination.
I DO think that uplift (of humans and pigs) is a good thing on its own—more intelligence means more of the universe experiencing and modeling itself.
It ignores the is-ought discrepancy by assuming that the way morals seem to have evolved is the “truth” of moral reasoning
No? Not sure how you got that from my post. My point is that morals are baked-in solutions to coordination problems between agents with different wants and power levels — baked into people's goal systems, just as "loving your kids" is a desire baked in by reproductive-fitness pressure. But instead of operating on brains, it works at the level of culture. I.e., Adaptation-Executers, not Fitness-Maximizers.
I also think it’s tactically unsound—the most common human-group reaction to something that looks like a threat and isn’t already powerful enough to hurt us is extermination.
Eh. I think that's one consideration among several, but it probably won't play out that way. More likely it's either a ban on everything even remotely related, or some chaos as different regulatory systems try to do stuff.