Your specific examples of Eliezer, me, or my best (or Nth-best, for N up to 4) friend(s) are contributing to the friendly AI problem, which has a way, way bigger average number of affected beings than a mere 3^^^3. If you somehow remove those considerations (maybe by the entity making the offer increasing someone else’s ability to contribute to an equal degree), or instead mention any family member or a fifth-or-lower best friend, then my obvious snap judgement is of course I’d choose the torture/death, without hesitation, and I am confused and disturbed by how anyone could answer otherwise.
I’m planning on possibly doing much worse things to myself if it becomes necessary, which it might. Or to other people, if I can figure out Eliezer’s ethical injunction stuff well enough to safely make exceptions and that information tells me to.
?????????
Well, assuming they exist, in order to be dustspecked …
Might wanna do a SAN check there—you just claimed that in real life, FAI would affect a way way bigger number than 3^^^3. If that wasn’t a typo, then you don’t seem to know what that number is.
I agree with you a lot, but would still like to raise a counterpoint. To illustrate the problem with mathematical calculations involving truly big numbers, though: what would you regard as the probability that some contortion of this universe’s laws allows for literally infinite computation? I don’t give it a particularly high probability at all, but I couldn’t in any honesty assign it one anywhere near 1/3^^^3. The naive expected number of minds FAI affects doesn’t even converge in that case, which at least for me is a little problematic.
Yes, if he had said “I think there is a small-but-reasonable probability that FAI could affect way way more than 3^^^3 people”, I wouldn’t have had a problem with that (modulo certain things about how big that probability is).
Well, small-but-reasonable times infinity equals infinity. Which is indeed way, way bigger than 3^^^3.
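To spell out the arithmetic (a rough sketch; the symbols $p$ and $N$ are mine, standing for the small-but-reasonable probability of literally infinite computation and the number of minds affected otherwise):

$$\mathbb{E}[\text{minds affected}] = p \cdot \infty + (1 - p) \cdot N = \infty \quad \text{for any } p > 0,$$

so the naive expectation doesn’t converge, and in particular it exceeds $3\uparrow\uparrow\uparrow 3$ no matter how small $p$ is, as long as it isn’t literally zero.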
That’s what I DID say, “average”, and my reasoning is roughly the same as wanderingsouls’, except I don’t consider it to be any kind of problem. The Omega Point, triggered inflation creating child universes, many other things we haven’t even thought about… I’d estimate the probability that FAI will find practically infinite computational power at around 10% or so.
And yeah, if I had chosen the wording myself I’d probably have chosen something a bit more humble that I can actually comprehend, like a googolplex, but 3^^^3 is the standard “incomprehensibly large number” used here, and I’m just using it to mean “would be infinite if we could assume transfinite induction”.
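For scale (just unpacking the standard definition of Knuth’s up-arrow notation; nothing here is specific to this thread): a googolplex is $10^{10^{100}}$, a tower of exponents only a few levels high, whereas

$$3\uparrow\uparrow\uparrow 3 = 3\uparrow\uparrow(3\uparrow\uparrow 3) = 3\uparrow\uparrow 3^{3^{3}} = 3\uparrow\uparrow 7{,}625{,}597{,}484{,}987,$$

a power tower of 3s roughly 7.6 trillion levels high. So “a bit more humble” is putting it mildly.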
Ack! Sorry, I must have missed the ‘average’. Retracted.