You can ignore this for now, since I need to work through whether it is still true depending on how we view the source of uncertainty in doubling time. Edit: this explanation is correct afaict and worth looking into.
The parameters for the second lognormal (doubling time at RE-Bench saturation, 10th: 0.5 mo., 90th: 18 mo.), when converted to an equivalent inverse Gaussian by matching mean and variance (approx. InverseGaussian[7.97413, 1.315]), are implausible. The linked paper highlights that, for the distribution to represent a doubling process reasonably, the ratio of the first parameter to the second ought to be << 2/ln(2) (or << 1/(2 ln(2)^2)). Failing that condition indicates that the "size hypothesis" of the underlying growth process is violated, i.e. the distribution is no longer modeling uncertainty around such a process.
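For concreteness, here is a minimal sketch in Python of the moment matching and ratio check described above (my own reconstruction from the stated percentiles, not code from the model):

```python
import numpy as np
from scipy import stats

# Second lognormal: doubling time at RE-Bench saturation, 10th pct 0.5 mo., 90th pct 18 mo.
z90 = stats.norm.ppf(0.9)                         # ~1.28
mu_log = 0.5 * (np.log(0.5) + np.log(18.0))       # log-median; median = sqrt(0.5 * 18) = 3 months
sigma = (np.log(18.0) - np.log(0.5)) / (2 * z90)  # ~1.40

mean = np.exp(mu_log + sigma**2 / 2)              # ~7.97 months
var = (np.exp(sigma**2) - 1) * mean**2            # ~385

# Inverse Gaussian IG(mu, lam) with the same mean and variance; its variance is mu^3 / lam
lam = mean**3 / var                               # ~1.32, i.e. roughly InverseGaussian[7.97, 1.32]

# Ratio check from the linked paper: mu / lam should be << 2 / ln(2) (~2.89)
print(mean / lam, 2 / np.log(2))                  # ~6.06 vs ~2.89 -> condition violated
```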
Ok, so that’s too many functions; what does it mean? In general, it means that our “uncertainty” is now the main driver of fast timelines, rather than reflecting a lack of knowledge in any way. The distribution is so stretched that the mode and median are wildly smaller than the mean, entirely due to the possibility that some random unknown event causes foom, unrelated to the estimated “growth rate” of the process. It’s like cranking up a noise term on a stock market model, being surprised that some companies are estimated to go to the moon tomorrow, and then claiming that this is because those stocks were estimated to have huge upsides.
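To put rough numbers on that stretch (a quick back-of-the-envelope check, not something from the model itself), the same lognormal's central-tendency measures already diverge sharply:

```python
import numpy as np
from scipy import stats

# Same lognormal as above (10th pct 0.5 mo., 90th pct 18 mo.)
z90 = stats.norm.ppf(0.9)
mu_log = np.log(3.0)                              # median = sqrt(0.5 * 18) = 3 months
sigma = np.log(36.0) / (2 * z90)                  # ~1.40

print("mode:  ", np.exp(mu_log - sigma**2))       # ~0.42 months
print("median:", np.exp(mu_log))                  #  3.0 months
print("mean:  ", np.exp(mu_log + sigma**2 / 2))   # ~7.97 months
```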
There is not a good solution that keeps the model intact (and the underlying issue, again, is that the model works in outcome domains like time and frequency rather than input domains like time horizons, compute, or effective compute). If one were to keep the same mean and scale up the second parameter, the left side of the pdf would collapse, and the mode and median would jump much higher, resulting in a much later estimate of SC. That doesn’t mean that’s how to fix the model, but it does indicate that fast timelines are incidental to, and reflective of, other issues in the model.
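A rough sketch of that thought experiment (the scaled-up shape value of 30 is arbitrary, chosen only so the ratio condition is comfortably satisfied; the helper just converts to scipy's parameterization):

```python
import numpy as np
from scipy import stats

def ig(mean, lam):
    # scipy's invgauss(mu, scale=s) is IG(mu * s, s) in the usual (mean, shape)
    # parameterization, so invgauss(mean / lam, scale=lam) is IG(mean, lam)
    return stats.invgauss(mean / lam, scale=lam)

matched = ig(7.97, 1.32)   # moment-matched to the second lognormal, as above
scaled = ig(7.97, 30.0)    # same mean, second parameter scaled up (arbitrary illustrative value)

for name, d in [("matched", matched), ("scaled-up shape", scaled)]:
    print(f"{name}: 10th pct ~{d.ppf(0.1):.2f} mo., median ~{d.median():.2f} mo.")
# matched:         10th pct ~0.44, median ~2.1
# scaled-up shape: 10th pct ~3.8,  median ~7.0  (left tail collapses, SC estimate moves later)
```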
Apologies, just saw this now since we were taking a break! There are two doubling-space lognormals in the timelines forecast (see image attached), and only the second, when you create an inverse Gaussian matched to the lognormal’s mean and variance, is in a parameter range where the uncertainty, rather than the mean, is the driver of fast timelines (the matched distribution also has very similar 10th and 90th percentiles of 0.44 months and 18.7 months).
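A quick percentile check of that matched inverse Gaussian, using the same scipy parameterization as in the earlier sketch:

```python
from scipy import stats

# IG(mean ~7.97, shape ~1.32) via scipy: invgauss(mean / lam, scale=lam)
d = stats.invgauss(7.97 / 1.32, scale=1.32)
print(d.ppf(0.1), d.ppf(0.9))   # ~0.44 and ~18.7 months
```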
I do think the speed-up to the second lognormal is not super well justified, but I’m fine ignoring disagreements about parameter central tendencies. (It’s kind of odd to call it a speed-up, because the mean actually gets slower while the median gets somewhat faster and the sub-median region gets wildly faster, about 5x faster at the 10th percentile.)
I actually think adjusting this would make fast timelines significantly more appealing to people looking into the model. A big “what?” issue for me, at least, is how much mass the timelines model puts on us already having, or being about to have, SC. Adjustments that keep the median fairly close but sharply curtail how fast the 10th percentile is would make me update toward trusting the model more (and thus toward believing a <2030 SC timeline more).