I’m happy with a confident “yes” to that last question.
Me too, but I recognize that I’m much less happy with people applying the reasoning to take away my self-direction and choice. I’m uncomfortable with the elitism that says “I’m better at it, so I follow different rules”, but I don’t know any better answer.
If we change “blow up the world” to “kill a fly”, at what point does the confidence start to waver?
If we change “will blow up” to “maybe blow up” to “might blow up”, when does it start to waver?
Another very edge case comes from Star Control II. The Ur-Quan are of the opinion that the mere existence of a random sentient species in the universe carries an unacceptable risk: it might turn out to be a homicidal one, or make a torture world and kill all other life. Their two internal factions disagree on whether dominating all other species is enough (The Path of Now and Forever) or whether specicide until only Ur-Quan life remains is called for (The Eternal Doctrine). Because of their species’ history and special makeup, they have reason to believe they are in an enhanced position to understand the risks of xenolife.
Ruminating on the Ur-Quan, I came to the position that, yes, allowing other species to live (free) does pose a risk of extremely bad outcomes, but this risk is small compared to the expected richness that their lives add. What the Ur-Quan are doing is excessive, but if “will they blow up the world?” auto-warranted an infinitely confident yes for outlaw status, then their argument would carry through: the only way to make sure is to nuke/enslave (most of) the world.
I guess on a more human scale: having bats around means they might occasionally serve as jumping-off points for pretty nasty viruses. The mere possibility of this is not enough to jump to the conclusion that bats should be made extinct. And for positions in human organizations, the fact that a position is filled by a human, and is therefore fallible, doesn’t mean its holder should be barred from exercising any of its powers.