[Question] What is the best way to talk about probabilities you expect to change with evidence/experiments?
I was thinking about my p(doom) over the next 10 years and came up with something around 6%[1]. However, that figure rests on things that are currently unknown to me, like the nature of current human knowledge production (and the bottlenecks involved); depending on which bottlenecks turn out to exist, my p(doom) would move to either 3% or 15%. Is there a technical way to describe this probability distribution contingent on evidence?
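For concreteness, here is a minimal sketch (Python) of what I mean, treating the headline 6% as a mixture over the bottleneck evidence via the law of total probability. The 3% and 15% branch values are the ones above; the 25% weight on the "bad bottleneck" branch is not something I stated directly, it is just the weight that makes the mixture come out to 6%.

```python
# Law of total probability: p(doom) = sum over evidence outcomes of
#   p(doom | outcome) * p(outcome).
p_doom_given_bad_bottleneck = 0.15     # branch value from above
p_doom_given_benign_bottleneck = 0.03  # branch value from above
# Assumed prior weight on the bad branch, solved from 0.06 = w*0.15 + (1-w)*0.03
p_bad_bottleneck = 0.25

p_doom = (p_bad_bottleneck * p_doom_given_bad_bottleneck
          + (1 - p_bad_bottleneck) * p_doom_given_benign_bottleneck)
print(p_doom)  # ~0.06
```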
I’m bearish on LLMs leading directly to AGI (10% chance), and I put roughly a 30% chance on LLM-based AI fooming quickly enough to kill us, and wanting to kill us, within 10 years. There is a 3% chance that something comes out of left field and does the same.
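Roughly, if the 30% foom-and-kill number is read as conditional on the 10% LLM-to-AGI branch (that conditioning is an assumption, not something I spelled out above), the pieces compose into the 6% headline like this:

```python
# Rough composition of the headline number, assuming the 30% is conditional
# on the 10% LLM-to-AGI branch.
p_llm_agi = 0.10               # LLMs lead directly to AGI
p_foom_kill_given_llm = 0.30   # such an AI fooms fast enough, and wants, to kill us
p_left_field = 0.03            # something unexpected does the same

p_doom = p_llm_agi * p_foom_kill_given_llm + p_left_field
print(p_doom)  # ~0.06
```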