Isn’t dissolving the concept of personal identity relatively straightforward?
Nay, I don’t think it is.
I don’t take issue with anything in particular you said in this comment, but it doesn’t feel like a classic, non-greedy Reduction in the style of those used to reduce free will to cognitive algorithms or causality to math.
The confused concept that I do not think has been dissolved is the sense in which you can create another entity arbitrarily like yourself, say, “I identify with this creature based on so-and-so definition,” and then have different experiences than the golem no matter how like you it is. I am not sure a non-fake dissolving of it has even been started. (Example: Susan Blackmore’s recent “She Won’t Be Me”. This is clearly a fake reduction; you don’t get to escape the difficulties of personal identity confusion by saying a new self pops up every few minutes/seconds/Planck times. Your comment is less obviously wrong, but it still sidesteps the confusion instead of Solving it.)
Hell, it’s not just a Confusing Problem. I’d say it’s a good candidate for The Most Confusing Problem.
Edit (one of many little ones): I made this comment pretty poorly, but I hope the point both makes sense and got through relatively intact. Mitchell Porter’s comment is also really good until the penultimate paragraph.
The confused concept that I do not think has been dissolved is the sense in which you can create another entity arbitrarily like yourself, say, “I identify with this creature based on so-and-so definition,” and then have different experiences than the golem no matter how like you it is. I am not sure a non-fake dissolving of it has even been started.
I tried responding to this example, but I find the whole thing so foreign and confused that I don’t even know how to make enough sense of it to offer a critique or an explanation. Why wouldn’t you expect there to exist an entity with different experiences than the golem, one which remembers having identified with the golem? You’re not killing it, after all.