Well, I definitely agree that we should build non-superintelligent AIs for study, and for a great many other reasons besides. But it’s perhaps less clear what ‘too stupid to foom’ actually means for an AGI. There was a moment when a hominid brain crossed an invisible line and civilization became possible; but the mutation that precipitated the change may not have looked like a major event to an outside observer. It may have looked like just another step in a long sequence of iterative improvements. Is the foom line in about the same place as the agriculture line? Is it easier to cross, or harder?
On the other hand, it’s possible to imagine an experimental AGI with values like “Fulfill [utility function X] in the strictly defined spatial domain of Neptune, using only materials that were contained in the gravity well of Neptune in the year 2000, including the construction of your own brain, and otherwise avoid >epsilon changes to probable outcomes for the universe outside the domain of Neptune.” Then fill in whatever utility function you’d like to test; you could try this with each new iteration of AGI methodology, once you are actionably worried about the possibility of fooming.
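To make that spec a bit more concrete (this is my own rough formalization, not a standard one), you can read it as a constrained optimization: maximize the test utility X evaluated only over the state of the Neptune domain, subject to a hard cap on how much the agent’s policy shifts the probability of any event outside that domain relative to a do-nothing baseline:

$$\max_{\pi}\ \mathbb{E}\!\left[X(s_{\mathrm{Neptune}})\mid\pi\right]\quad\text{subject to}\quad\sup_{E\,\notin\,\mathrm{Neptune}}\bigl|P(E\mid\pi)-P(E\mid\pi_{\mathrm{null}})\bigr|\le\epsilon$$

Most of the difficulty hides in the constraint: you need a crisp definition of the domain boundary and of the null-policy baseline $\pi_{\mathrm{null}}$, or the ‘>epsilon changes’ clause isn’t well-defined enough for the agent to optimize against.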