I agree; I'm also not completely sure about the dynamics of the intelligence explosion. I'd like more concrete footing for figuring out what takeoff will look like, since neither fast nor slow takeoff has been proven.
My intuition, however, points the other way. I can't disprove a slow takeoff, but it seems intuitive to me that there are some "easy" modifications that would take us far beyond human level. Those intuitions, though they could be wrong, are these:
- Human capability seems limited in some obvious ways. If I had more time and energy to focus on interesting problems, I could accomplish WAY more. Most of us get bored, lazy, distracted, or too bogged down by our responsibilities to unlock our full potential, and sometimes our thinking just gets cloudy. It reminds me a bit of the movie Limitless: imagine being just a human, but one where every part of your brain ran like a well-oiled machine.
- A single AI would not need to solve the many coordination problems that keep humanity as a whole from acting like a superintelligence.
- An AI can scale its search abilities in an embarrassingly parallel way (see the sketch just below this list). It can also specialize different parts of itself for different tasks; imagine a brain built purely for scientific research.
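To make the "embarrassingly parallel" point concrete, here is a minimal sketch in Python: because each candidate solution is scored independently, adding workers scales throughput roughly linearly with no coordination overhead. The objective function and candidate distribution here are purely illustrative assumptions of mine, not anything specific to AI search.

```python
# A minimal sketch of embarrassingly parallel search: many candidates are
# scored independently, so throughput scales with the number of workers.
# The objective ("distance to a hidden target") is a hypothetical stand-in.
import random
from concurrent.futures import ProcessPoolExecutor


def score(candidate: float) -> tuple[float, float]:
    """Illustrative objective: higher is better (closer to a hidden target)."""
    target = 3.14159  # assumed unknown optimum, for demonstration only
    return (-abs(candidate - target), candidate)


def parallel_search(num_candidates: int = 10_000, workers: int = 8) -> float:
    """Draw random candidates, score them in parallel, and keep the best one."""
    candidates = [random.uniform(-10.0, 10.0) for _ in range(num_candidates)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        # Each score() call is independent, so the work splits cleanly across processes.
        best_score, best_candidate = max(pool.map(score, candidates, chunksize=256))
    return best_candidate


if __name__ == "__main__":
    print(parallel_search())
```

Nothing clever is happening here, and that is the point: doubling the worker count roughly doubles how many candidates get evaluated per unit time, which is the kind of scaling humans can't apply to their own thinking.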
Perhaps intelligence is hard and won't scale much further than this, but I feel that even this much already gets you supervillain-level intelligence. Maybe not "make us look like ants" intelligence, but enough for domination.