rounding errors relative to the total harm, the only situation in which you'd talk about them is one where you've already granted that the existential risks don't exist.
It's possible to consider relatively irrelevant things, such as everything in ordinary human experience, even when there is an apocalypse on the horizon. The implied contextualizing norm demands an inability to consider them, or at least raises the cost of doing so.