Cyan—think of the million monkeys at typewriters eventually outputting a replica of Chalmers’ book. The monkeys obviously haven’t given an argument. There’s just an item there that you are capable of projecting a meaningful interpretation onto. But the meaning obviously comes from you, not the monkeys.
Credulous—I’m not entirely sure what you’re asking. I think an agent could still have qualia without believing that this is so on a theoretical level. (Dennett springs to mind!) But I guess if you tinkered with the internal computational processes enough, you might eventually succeed in ridding the agent of [the neural underpinnings of] phenomenal representations (e.g. of pain) altogether. It would then behave very differently.
PK—Yep, you’re so very special that you’re the only discussant in this conversation who’s made entirely of straw!