(Phase 1)
Agreed, I don’t see why the mind isn’t a type of “computer”, or why living organisms aren’t “machines”. If there were something truly different and special about being organic, then we could just build an organic AI. I don’t get the distinction being made.
(Phase 2)
“...technological artifacts that have no possibility of empathy, compassion or understanding.”
Oh: that sounds like dualism of some kind, if it is impossible for a machine to have empathy, compassion or understanding. It would mean that beings with these qualities are somehow more than physical machines.
(Phase 3)
Reading through some of the comments on the article, it sounds like the objection isn’t that intelligence is necessarily non-physical, but that “computation” doesn’t encompass all possible physical activity. I guess the idea is that if reality is continuous, then there could be some kind of complexity gap between discrete computation and an organic process.
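For concreteness, here’s what I understand “discrete computation approximating a continuous process” to mean. This is just a minimal sketch of my own (in Python; the function name and the toy system are mine, not anything from the article): it steps through a continuous dynamical law in discrete chunks, and the discrete answer converges toward the continuous one as the steps get finer.

```python
import math

# Continuous process: exponential decay dx/dt = -x, whose exact
# (continuous) solution is x(t) = x0 * exp(-t). A digital computer
# can only take discrete steps, so we approximate with Euler updates.

def euler_decay(x0, t_end, steps):
    """Discretely simulate dx/dt = -x from t=0 to t_end in `steps` steps."""
    dt = t_end / steps
    x = x0
    for _ in range(steps):
        x += dt * (-x)  # one discrete update of the continuous law
    return x

exact = 1.0 * math.exp(-1.0)  # the continuous answer at t = 1
for steps in (10, 100, 1000, 10000):
    approx = euler_decay(1.0, 1.0, steps)
    print(f"{steps:>6} steps: x = {approx:.6f}, error = {abs(approx - exact):.2e}")
```

The error shrinks as the step count grows, which I’d guess is why AI researchers tend to treat continuity as an engineering detail; the burden on the objection would be to say what survives in the continuous limit that no discretization like this can capture.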
Phases 1-3 are the sequential steps I’ve taken to try to understand this point of view. A view can’t be rejected until it’s understood... I’m sure people here have considered the AI-is-impossible view before, but I hadn’t.
What is the physical materialist view on whether reality is discrete? (I would guess it’s agnostic.) What is the AI view on whether computations must be discrete? (I would guess AI researchers wouldn’t dismiss a continuous computation as a non-computational thing, if such a thing were possible?)
I agree it’s important to apply the principle of charity, but people have to apply the principle of effort too. If Sharkey’s point is about some crucial threshold that continuous systems possess, he should say so. The term “computational” is already taken, so he needs to find another term.
And he can’t be excused on the grounds that “it’s a short interview”, considering that he repeated the same point several times and seemed to find enough space to spell out what (he thinks) his view implies.