> It ignores the is-ought discrepancy by assuming that the way morals seem to have evolved is the “truth” of moral reasoning
No? Not sure how you got that from my post. My point is that morals are baked-in solutions to coordination problems between agents with different wants and power levels — baked into people’s goal systems, just as “loving your kids” is a desire that was baked in by reproductive fitness pressure. But instead of operating on brains, it works at the level of culture. I.e. Adaptation-Executers, not Fitness-Maximizers.
> I also think it’s tactically unsound—the most common human-group reaction to something that looks like a threat and isn’t already powerful enough to hurt us is extermination.
Eh. I think that’s one consideration among several, but the reaction will probably not be extermination. More likely it will be either a ban on everything even remotely related, or some chaos as different regulatory systems try to do stuff.