Yes. I’m wondering if these disputes simply resolve to having different subjective experiences of what it means to be alive. In fact, maybe the mistake is assuming that p-zombies don’t exist. Maybe some humans are p-zombies!

However, “that’s just how a sufficiently advanced self-monitoring system would feel from the inside” seems like almost a contradiction in terms. Can a self-monitoring system become sufficiently advanced without feeling anything (just as my computer computes but, I suppose, doesn’t feel)?
I think not. But I think that makes it entirely unsurprising, obvious even, that a more advanced computer would feel.
If so, I want to know why.
Because it seems like the most plausible explanation for the fact that I feel, to the extent that I do. (It also explains the otherwise quite confusing result that, for many kinds of actions, our decision-making processes activate after we’ve already acted, even though we feel like our decision determined the action.)
I don’t know what that second thing has to do with consciousness.