The part where this gets difficult is understanding why we evolved to have conscious intentions in the first place. What purpose does it serve to make us actually want things, rather than simply act as if we wanted them? Why aren’t we like toasters?
This also gets at one of the reasons why I think it’s a fool’s errand to try to bring about the singularity with a non-sentient AI. If it were possible to produce that level of intelligence without consciousness (and to do so efficiently and cheaply), surely natural selection would have done so? Instead it made us sentient, which suggests that sentience is a useful thing to have.
For certain values of “act as if”, what you’re asking is why we aren’t p-zombies.
Yeah, and that’s an interesting question.