Apropos of this, the Eliezer-persuading-his-Jailer-to-let-him-out thing was on reddit yesterday. I read through it and today there’s this. Coincidence?
Anyway, I was thinking about the AI Jailer last night, and my thoughts apply to this equally. I am sure Eliezer has thought of this so maybe he has a clear explanation that he can give me: what makes you think there is such a thing as “intelligence” at all? How do we know that what we have is one thing, and not just a bunch of tricks that help us get around in the world?
It seems to me a kind of anthropocentric fallacy, akin to ancient peoples believing that the gods were literally giant humans up in the sky. We don’t believe that anymore, but we still assume any superior being must essentially be a giant human, mind-wise.
To give an analogy: imagine a world with no wheels (and maybe no atmosphere so no flight either). The only way to move is through leg-based locomotion. We rank humans in running ability, and some other species fit into this ranking also, but would it make sense to then talk about making an “Artificial Runner” that can out-run all of us, and run to the store to buy us milk? And if the AR is really that fast, how will we control it, given that it can outrun the fastest human runners? Will the AR cause the human species to go extinct by outrunning all the males to mate with the females and replace us with its own offspring?
It might be worth using 0.00000005% of the world GDP to make sure that the AR is not a threat, especially if modern theories say that it’s likely to be one.
Come back with that comment when Running, rather than Intelligence, is what allows you to construct a machine that runs increasingly faster than you intended your artificial runner to run.
Because in a world where running fast leads to additional fastness of running, this thing is going to either destroy your world through kinetic release or break the FTL laws and rewrite the universe backwards to have always been all about running.