I would suspect it gets increasingly sketchy to characterize 1/8th, 1/16th, etc. ‘units of knowledge towards AI’ as ‘breakthroughs’ in the way I define the term in the post.
Absolutely. It does—eventually. Which is partially my point. The extrapolation looks sound, until suddenly it isn’t.
I take your point that we might get our wires crossed when a given field looks like it’s accelerating, but when we zoom in to only look at that field’s breakthroughs, we find that they are decelerating. It seems important to watch out for this.
I think you may be slightly missing my point.
Once you hit the point that you no longer consider any recent advances breakthroughs, yes, it becomes obvious that you’re decelerating.
But until that point, breakthroughs appear to be accelerating.
And if you’re discretizing into breakthrough / non-breakthrough, you’re ignoring all the warning signs that the trend might not continue.
(To return to my previous example: say we currently consider any one step that’s >=1/16th of a unit of knowledge as a breakthrough, and we’re at t=2.4… we had breakthroughs at t=1, 3/2, 11/6, 25/12, 137/60. The rate of breakthroughs is accelerating! And then we hit t=49/20, and no breakthrough. And it either looks like we plateaued, or someone goes ‘no, 1/32nd of a unit of advancement should be considered a breakthrough’ and makes another chart of accelerating breakthroughs.)
(Yes, in this example every discovery is half as much knowledge as the last one, which makes it somewhat obvious that things have changed. The ratio of 0.5 was just chosen because it makes the math simple. However, all the same issues occur with a ratio of e.g. 0.99 instead of 0.5, just more gradually. Which makes the ‘no, the last advance should be considered a breakthrough too’ argument a whole lot easier to inadvertently accept...)
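(For concreteness, here’s a minimal sketch of that toy model in Python. The function name and parameters are mine, purely illustrative; the model is just: discovery n is worth r^(n-1) units of knowledge and arrives 1/n after the previous one, so the arrival times are the harmonic partial sums above.)

```python
# Minimal sketch of the toy model above (all names here are illustrative,
# not from the original discussion). Discovery n delivers r**(n-1) units
# of knowledge and arrives 1/n after the previous one, so arrival times
# are the harmonic partial sums 1, 3/2, 11/6, 25/12, 137/60, ...
from fractions import Fraction

def simulate(n_discoveries=7, r=Fraction(1, 2), threshold=Fraction(1, 16)):
    t = Fraction(0)
    for n in range(1, n_discoveries + 1):
        t += Fraction(1, n)       # gaps shrink: 1, 1/2, 1/3, ... (looks like acceleration)
        size = r ** (n - 1)       # each discovery is r times the size of the last
        verdict = "breakthrough" if size >= threshold else "no breakthrough"
        print(f"t = {t} ({float(t):.3f}): step of {size} -> {verdict}")

simulate()
```

Run it and the first five steps print as breakthroughs arriving at shorter and shorter intervals; then the step at t=49/20 (size 1/32) falls below the threshold. Lower the threshold to 1/32 and you get one more ‘accelerating’ data point, which is exactly the redefinition move described above.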