Yes: what we learn from trolley problems is that human moral intuitions are absolute crap (technical term). Starting with even the simplest trolley problems, you find that many people have very strong but mutually inconsistent moral intuitions. Others immediately blue-screen when presented with a moral problem involving any causal complexity. The upshot is that trolley problems are primarily diagnostic tools: they identify corrupted software behaving inconsistently.
Back to the object level, the right answer depends on other assumptions. Unless someone wants to claim to have solved all meta-ethical problems and to possess the one correct ethical system, "a right answer" is the correct framing rather than "the right answer," because an answer is only right within a given ethical framework. Almost any consequentialist system will output "save the most lives/QALYs."
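To make that last claim concrete, here's a minimal sketch of the decision rule a simple consequentialist system implements: score each available action by expected QALYs and take the argmax. All names and numbers here are made-up illustrations, not a serious ethical calculus.

```python
# Toy consequentialist decision rule: pick the action that maximizes
# expected QALYs. The action names and QALY figures are hypothetical.
from typing import Dict


def choose_action(expected_qalys: Dict[str, float]) -> str:
    """Return the action with the highest expected QALYs."""
    return max(expected_qalys, key=lambda a: expected_qalys[a])


# Classic trolley setup: do nothing (five die) vs. pull the lever (one dies).
# Scores are QALYs saved relative to doing nothing; ~30 QALYs per life
# is an arbitrary placeholder.
outcomes = {
    "do_nothing": 0.0,       # baseline: five people die
    "pull_lever": 4 * 30.0,  # net four lives saved vs. baseline
}

print(choose_action(outcomes))  # -> "pull_lever"
```

Whatever weights you plug in, any rule of this shape outputs "pull the lever" as long as more lives saved means a higher score; the interesting disagreements are all about whether maximizing that score is what morality requires.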