The problem there is twofold: firstly, a lot of aspects would not necessarily scale up to a smarter system, and it’s sometimes hard to tell what generalizes and what doesn’t. Secondly, it’s very, very hard to pinpoint the “intelligence” of a program without running it; if we make one too smart, it may be smart/nasty enough to feed us misleading data so that our final AI will not share moral values with humans. It’s what I’d do if some aliens tried to dissect my mind to force their morality on humanity.
firstly, a lot of aspects would not necessarily scale up to a smarter system, and it’s sometimes hard to tell what generalizes and what doesn’t.
I agree, but trying to solve the problem without any hands-on knowledge is certainly more difficult.
Secondly, it’s very, very hard to pinpoint the “intelligence” of a program without running it
I agree, there is a risk that the first AGI we build will be intelligent enough to skillfully manipulate us, but I think the chances are quite small. I find it difficult to imagine skipping dog-level and human-level intelligence and jumping straight to superhuman intelligence, though it is certainly possible.