I will accept that “AGI-now” proponents should carry the blame for a hypothetical Paperclip apocalypse when Friendliness proponents accept similar blame for an Earth-bound humanity flattened by a rogue asteroid, or leveled by any of the various threats a superintelligence (or, say, the output of a purely human AI research community unburdened by Friendliness worries) might be able to counter. I previously gave Orlov’s petrocollapse as yet another example.
Actually, I fully intended the implication that he was risking more than his own life. Self-inflicted risks don’t concern me.
Now you’ve got me wondering what the casualty distribution for speeding-induced accidents looks like.
Well, if ASCII has his way, there may be one data point at casualty level 6.6 billion …