Indeed it doesn’t, but constraining something by human capability makes it less powerful and hence potentially less unsafe. Though that’s probably not what Spector wants to do.
Voted back up to zero because this seems true as far as it goes. The problem is that if he succeeds in building something with a useful AGI component at all, that makes it much more likely (at least according to how my brain models things) that something which doesn’t need a human in the loop will appear soon after, whether as a modification of the original system, as a new system designed by the original system, or simply as a new system inspired by insights from the original system.
Just because humans are involved doesn’t mean that the whole system is constrained by the human element.