> Yes, if he had said “I think there is a small-but-reasonable probability that FAI could affect way way more than 3^^^3 people”, I wouldn’t have had a problem with that (modulo certain things about how big that probability is).
That’s what I DID say, “average”, and my reasoning is roughly the same as wanderingsouls’, except I don’t consider it to be any kind of problem. The Omega Point, triggered inflation creating child universes, many other things we haven’t even thought of… I’d estimate the probability that FAI will find practically infinite computational power at around 10% or so.
And yeah, if I had chosen the wording myself I’d probably have chosen something a bit more humble that I can actually comprehend, like a googolplex, but 3^^^3 is the standard “incomprehensibly large number” used here, and I’m just using it to mean “would be infinite if we could assume transfinite induction”.
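For what it’s worth, 3^^^3 is Knuth’s up-arrow notation, and the reason it is “incomprehensibly large” falls out of the recursion. A minimal sketch (hypothetical helper name `up_arrow`; only feasible for tiny arguments):

```python
def up_arrow(a, n, b):
    """Compute a ↑^n b in Knuth's up-arrow notation; ↑^1 is plain exponentiation."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    # Each extra arrow iterates the previous level: a ↑^n b = a ↑^(n-1) (a ↑^n (b-1))
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

# Small cases are still computable:
print(up_arrow(3, 1, 3))  # 27  (3^3)
print(up_arrow(2, 2, 3))  # 16  (2^(2^2))
# 3^^^3, i.e. up_arrow(3, 3, 3), is astronomically beyond any physical computation.
```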
> Yes, if he had said “I think there is a small-but-reasonable probability that FAI could affect way way more than 3^^^3 people”, I wouldn’t have had a problem with that (modulo certain things about how big that probability is).
Well, a small-but-reasonable probability times infinity is still infinity, which is indeed way, way bigger than 3^^^3.
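The arithmetic here can be checked directly, at least in the IEEE-float sense; a minimal sketch with illustrative numbers only:

```python
# Expected value with an unbounded payoff: any nonzero probability
# times an infinite payoff is still infinite.
p = 0.1                  # illustrative "small-but-reasonable" probability
payoff = float("inf")    # "practically infinite" computational power
expected = p * payoff
print(expected)          # inf
```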
> That’s what I DID say, “average”, and my reasoning is roughly the same as wanderingsouls’, except I don’t consider it to be any kind of problem. The Omega Point, triggered inflation creating child universes, many other things we haven’t even thought of… I’d estimate the probability that FAI will find practically infinite computational power at around 10% or so.
> And yeah, if I had chosen the wording myself I’d probably have chosen something a bit more humble that I can actually comprehend, like a googolplex, but 3^^^3 is the standard “incomprehensibly large number” used here, and I’m just using it to mean “would be infinite if we could assume transfinite induction”.
Ack! Sorry, I must have missed the ‘average’. Retracted.