I don’t quite get the argument here; doesn’t anthropic shadow imply we have nothing to worry about (except for maybe hyperexistential risks) since we’re guaranteed to be living in a timeline where humanity survives in the end?
But it doesn’t say we’re guaranteed to be living in a timeline where humanity survives.
If I had a universe copying machine and a doomsday machine, pressed the “universe copy” button 1000 times (doubling each time, for 2¹⁰⁰⁰ universes), then smashed relativistic meteors into Earth in all but one of them… would you call that an ethical issue? I certainly would, even though the inhabitants of the original universe are guaranteed to be living in a timeline where they don’t die horribly in a meteor apocalypse.
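To put rough numbers on that, here is a quick back-of-the-envelope sketch, assuming each button press doubles every existing universe and that the universes are equally populated (both assumptions are mine, for illustration):

```python
import math

# Thought-experiment arithmetic: each press of the "universe copy" button
# doubles every existing universe, so 1000 presses yield 2^1000 universes.
presses = 1000
universes = 2 ** presses        # 2^1000 universes after 1000 doublings
survivors = 1                   # every universe but one gets the meteors

# Fraction of universes (and, assuming equal populations, of inhabitants)
# that make it through.
survival_fraction = survivors / universes

print(f"universes created: 2^{presses} ≈ 10^{presses * math.log10(2):.0f}")
print(f"survival fraction: ≈ 10^{math.log10(survival_fraction):.0f}")
```

So even though one timeline is guaranteed to come through, only about a 10⁻³⁰¹ fraction of the people involved actually live in it, which is why the guarantee does nothing to dissolve the ethical problem.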