I’ve written a post on my blog covering some aspects of AGI and FAI.
It probably has nothing new for most people here, but could still be interesting.
I’d welcome feedback. In particular, I can’t remember whether my analogy with flight is something I came up with or something I heard here long ago, so I’d be glad to hear whether it’s novel, and whether it’s any good.
The flight analogy, or at least some variation of it, is pretty standard in my experience. (Incidentally, I heard a version of the analogy just recently, when I was reading through the slides of an old university course—see pages 15-19 here.)