Mr. Yudkowsky says: But does that question really make much difference in day-to-day moral reasoning, if you’re not trying to build a Friendly AI?
Here is a quote that I think speaks to that:
The Singularity is the technological creation of smarter-than-human intelligence. Ask what “smarter-than-human” really means. And as the basic definition of the Singularity points out, this is exactly the point at which our ability to extrapolate breaks down. We don’t know because we’re not that smart. -- Eliezer Yudkowsky
As I understand it, it is not possible, by definition, for a human to design a machine that is “smarter-than-human”. It is possible, however, to design a machine that can design a machine that is smarter than human. According to one of my correspondents, this has already occurred (quoted exactly, grammar and all):
“My current opinion is that the singularity is behind us. The deep discovery is the discovery of the Universal Machine, alias the computer, but we have our nose so close on it that we don’t really realize what is happening. From this, by adding more and more competence to the universal machine, we put it away from his initial “natural” intelligence. I even believe that the greek theologians were in advance, conceptually, on what intelligence his. Intelligence is confused with competence today. It is correct that competence needs intelligence to develop, but competence cannot be universal and it makes the intelligence fading away: it has a negative feedback on intelligence.
So my opinion is that the singularity has already occurred, and, since a longer time, we have abandon the conceptual tools to really appreciate that recent revolution. We are somehow already less smarter than the universal machine, when it is not yet programmed.”
Best regards,
Bruno Marchal IRIDIA-ULB http://iridia.ulb.ac.be/~marchal/