OP here. The post was inspired by this interview with Eliezer:
My impression after watching the interview:
Eliezer thinks that an unaligned AGI, if created, will almost certainly kill us all.
Judging by the despondency he expresses in the interview, he seems to regard unaligned AGI as about as deadly as a point-blank shot to the head from a large-caliber gun. So, a probability of doom of at least 99%.
But I can’t read his mind, so maybe my interpretation is incorrect.