Can you explain how Events #1-5 from your list are not correlated?
For instance, I’d guess #2 (learns faster than humans) follows naturally—or is much more likely—if #1 (algos for transformative AI) comes to pass. Similarly, #3 (inference costs <$25/hr) seems to me a foregone conclusion if #5 (massive chip/power scale) and #2 happen.
Treating the first five as conditionally independent puts you at 1% before arriving at 0.4% with external derailments, so it’s doing most of the work to make your final probability minuscule. But I suspect they are highly correlated events and would bet a decent chunk of money (at 100:1 odds, at least) that all five come to pass.
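To make the correlation point concrete, here’s a toy simulation (the numbers and the shared “feasibility” driver are mine for illustration, not the essay’s actual Events 1–5). When five events are all driven by one underlying factor, the true joint probability can exceed the product of the marginals by an order of magnitude:

```python
import random

random.seed(0)
N = 1_000_000
all_five = 0
marginals = [0] * 5

for _ in range(N):
    # Shared latent driver (illustrative): is transformative AI feasible at all?
    feasible = random.random() < 0.5
    # Each event is likely if the driver holds, unlikely otherwise.
    events = [random.random() < (0.9 if feasible else 0.05) for _ in range(5)]
    all_five += all(events)
    for i, e in enumerate(events):
        marginals[i] += e

p_joint = all_five / N          # ≈ 0.5 * 0.9**5 + 0.5 * 0.05**5 ≈ 0.30
p_product = 1.0
for m in marginals:
    p_product *= m / N          # each marginal ≈ 0.475, product ≈ 0.024
```

Multiplying the five marginals as if the events were independent understates the joint probability here by roughly a factor of twelve.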
They state that their estimated probability for each event is conditional on all previous events happening.
Thanks, I suppose I’m taking issue with sequencing five distinct conditional events that seem to be massively correlated with one another. The likelihoods of Events 1-5 seem to depend on each other in ways that mean you cannot assign a point probability to each event and multiply them together to arrive at 1%. Event 5 certainly doesn’t require Events 1-4 as a prerequisite, and arguably makes Events 1-4 much more likely if it comes to pass.
It’s a mathematical identity that
P(A&B&C&D&E) = P(A) P(B|A) P(C|A,B) P(D|A,B,C) P(E|A,B,C,D)
This doesn’t depend on A happening chronologically before or after B etc., it’s a true mathematical identity regardless.
This doesn’t depend on these things being uncorrelated. The formula is true even in the extreme case where two or more of these things are 100% perfectly correlated. (…In which case one or more of the factors on the right are going to be 1.0.)
You’re entitled to argue that P(TAI)>P(A&B&C&⋯), and you’re entitled to argue that people are assigning conditional probabilities in a wrong and confused way for whatever reason (e.g. see discussion here), but you can’t argue with the mathematical identity, right?
Apologies, I’m not trying to dispute math identities. And thank you, the link provided helps put words to my gut concern: that this essay’s conclusion relies heavily on a multi-stage fallacy, and arriving at point probability estimates for each event independently is fraught/difficult.