Apparently many people just don’t have a mental bin for global risks to humanity, only counting up the casualties to their own tribe and country. Either that or they’re just short-term thinkers.
Eliezer, I certainly worry about global risks to humanity, but I also worry about the “paradoxes” of utilitarian ethics. E.g., would you advocate killing an innocent person if long-term considerations convinced you it would have a 0.00001% chance of saving the human race? I’m pretty sure most people wouldn’t, and if asked for a reason, they might say that they don’t trust anyone to estimate such small probabilities correctly.