That number was presented as an example (“e.g.”). But more importantly, all the numbers in the range you offer here would argue for more AI alignment research! What we need to establish, on the naive version of the argument, is that the probability of a choice between ‘intergalactic civilization’ and ‘extinction of humanity within a century’ is not super-exponentially low. That seems easy enough if we can show that nothing in the claim contradicts established knowledge.
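To make the structure of that argument explicit, here is a rough expected-value sketch; the symbols are mine and the magnitudes are illustrative assumptions only:

$$ \mathbb{E}[\text{alignment research}] \approx p \cdot \delta \cdot V - c, $$

where $p$ is the probability that the choice above is real, $\delta$ is how much the research shifts the odds toward the good branch, $V$ is the value of an intergalactic civilization (astronomically large on any accounting), and $c$ is the cost of the research. The expected value only turns negative when $p < c / (\delta V)$, a threshold far below every number in the range you offered; that is why the naive argument needs nothing stronger than “not super-exponentially low”.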
I would argue the probability that this choice exists is far in excess of 50%. As examples of background information supporting this: Bayesianism implies that “narrow AI” designs should be compatible on some level; we know the human brain resulted from a series of kludges; and the superior number of neurons in an elephant’s brain is evidently not strictly required for taking over the world (humans manage with fewer). However, that stronger claim is not logically necessary for the point above.
(Technically you’d still have to deal with Pascal’s Mugging. However, I like the Hansonian adjustment as a solution, and, for example, I doubt an adult civilization would deceive its people about the nature of the world.)
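For readers who haven’t seen it, a rough sketch of the Hansonian adjustment (the “leverage penalty”); the formalization here is my paraphrase, not something established in this thread: if a hypothesis claims you are in a position to determine the fate of $N$ beings, its prior takes a penalty factor of about $1/N$, since at most a $1/N$ fraction of beings can actually occupy such a position. The mugger’s offer of $N$ units of utility is then bounded,

$$ \mathbb{E}[\text{pay the mugger}] \lesssim \frac{1}{N} \cdot N \cdot u = u, $$

where $u$ is the per-person stake, so cranking $N$ ever higher no longer forces the decision. The choice discussed above is arguably not of that form, since its probability rests on ordinary background knowledge rather than on the sheer size of the payoff.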