Venu: Given the tiny minority of AIs that will FOOM at all, what is the probability that an AI designed for a purpose other than FOOMing will instead FOOM?
It seems to me like a pretty small probability that an AI not designed to self-improve will be the first AI that goes FOOM, when there are already many parties known to me who would like to deliberately cause such an event.
Why not anti-predict that no AIs will FOOM at all?
A reasonable question from the standpoint of antiprediction; here you would have to refer back to the articles on cascades and recursion, the article on hard takeoff, etcetera.
Re Tim’s “suddenly develop the ability to reprogram and improve themselves all-at-once”: the issue is whether something happens efficiently enough to be local, or fast enough to accumulate an advantage between the leading Friendly AI and the leading unFriendly AI, not whether things can happen with zero resources or instantaneously. But the former position seems to be routinely distorted into a straw version of the latter.
It seems to me like a pretty small probability that an AI not designed to self-improve will be the first AI that goes FOOM, when there are already many parties known to me who would like to deliberately cause such an event.
I know this is four years old, but this seems like a damn good time to “shut up and multiply” (thanks for that thoughtmeme by the way).
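A minimal sketch of what “shut up and multiply” cashes out to here: multiply each scenario’s probability by the stakes and compare expected values, rather than arguing from gut feel. The probabilities and stakes below are invented placeholders purely to show the arithmetic, not estimates drawn from anyone in this thread.

```python
# Toy expected-value comparison in the spirit of "shut up and multiply".
# All numbers are hypothetical placeholders chosen only to illustrate the arithmetic.

p_accidental_first_foom = 0.02   # assumed: first FOOM comes from an AI not designed to self-improve
p_deliberate_first_foom = 0.30   # assumed: first FOOM comes from a deliberate self-improvement project
stakes = 1.0                     # normalized magnitude of the outcome at stake

ev_accidental = p_accidental_first_foom * stakes
ev_deliberate = p_deliberate_first_foom * stakes

print(f"Expected weight, accidental-first scenario: {ev_accidental:.2f}")
print(f"Expected weight, deliberate-first scenario: {ev_deliberate:.2f}")
```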