In order to run a brain without first understanding AI, you have to simulate the brain as a physical object.
What reasons are there to believe that we can understand intelligence without understanding the brain first? AIXI is to narrow AI as a universal Turing machine is to a modern Intel chip. To produce a modern Intel CPU you need a US$2.5 billion chip factory. To produce something like IBM Watson you need a company with a revenue of US$99.87 billion and 426,751 employees to support it. What reasons do you have to believe that developing artificial general intelligence capable of explosive recursive self-improvement requires orders of magnitude fewer resources than figuring out how the brain works? After all, the human brain is the only example of an efficient general intelligence that we have.
Because there aren’t any indications that general intelligence is so narrow a category that we have to copy the brain. So the question is “which is faster—normal AI research starting now, or modeling the brain starting later?” Once the brain is understood to some high degree, basing an intelligence on it gives you a cheat sheet for most of the decisions of normal AI research, but you still have to implement it computationally, which will be harder than normal AI research. So I think there’s a good chance, though I’m not certain, that normal AI research will be able to make good on its head start and create a self-improving AI first. Both will be faster than simulating a specific human brain, which is what I said would take orders of magnitude more resources.
Another consideration favoring normal AI over whole brain emulation is that evolution finds local optima. It may be possible to exceed the brain’s effectiveness or efficiency at some intellectual tasks by using a radically different architecture.
Yes, that is about the correct answer to this question. We can see that emulations of scanned brains won’t come first, since they require more advanced technology and understanding to develop. It’s the same situation as with scanning birds to build aircraft—broadly speaking.
I am not sure what you were going for here—but FWIW, AIXI is pretty general.
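For context on why AIXI counts as fully general: Hutter defines it over the class of all computable environments, not any particular domain. A standard statement of its action-selection rule (following Hutter's formulation; notation here is the usual one, with $U$ a universal Turing machine, $a$ actions, $o$ observations, $r$ rewards, and $\ell(q)$ the length of program $q$) is:

$$
a_k := \arg\max_{a_k} \sum_{o_k r_k} \ldots \max_{a_m} \sum_{o_m r_m} \left[ r_k + \ldots + r_m \right] \sum_{q \,:\, U(q, a_1 \ldots a_m) = o_1 r_1 \ldots o_m r_m} 2^{-\ell(q)}
$$

The outer sum weights every program consistent with the interaction history by $2^{-\ell(q)}$, i.e. a Solomonoff-style prior over all computable hypotheses, which is what makes AIXI general—and also incomputable, which is why it stands to practical AI roughly as a universal Turing machine stands to an Intel chip.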