I don’t do selfishness. Partially this is because I don’t believe in personal identity, partially because I feel like there’s no fundamental difference between me and anyone else, and partially because it doesn’t make sense to me intuitively. As such, I try to help myself as much as is necessary to help people in general. Anything more is akrasia.
You generally should donate all of your money to one cause. If you’re “donating to yourself”, there are significantly diminishing returns, so you’d figure out how much you value the two comparatively, and find the point at which the marginal value of your own happiness equals the marginal value of the best charity.
Incidentally, this works out to keeping a certain fixed amount of money and donating everything beyond it, regardless of how much you actually make. As such, if you’re Bill Gates, you’d donate virtually all your money to charity.
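To illustrate the reasoning in the two paragraphs above, here is a minimal sketch (not from the original comment): assume a made-up concave utility for money you keep and a constant per-dollar value for the best charity. The optimum is to keep the amount at which the two marginal values are equal, and that threshold does not depend on your income.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# All numbers here are made up purely for illustration.
CHARITY_VALUE_PER_DOLLAR = 1.0  # assumed constant marginal value of a donated dollar

def personal_utility(kept):
    # Assumed diminishing returns on money kept for yourself (log utility).
    return 2000.0 * np.log(kept + 1.0)

def optimal_amount_kept(income):
    """Split income into 'kept' and 'donated' to maximize total value."""
    objective = lambda k: -(personal_utility(k) + CHARITY_VALUE_PER_DOLLAR * (income - k))
    return minimize_scalar(objective, bounds=(0.0, income), method="bounded").x

for income in (30_000, 100_000, 10_000_000):
    print(f"income {income:>10,}: keep about {optimal_amount_kept(income):,.0f}")
# The amount kept is roughly the same at every income level: the point where
# personal_utility'(k) equals CHARITY_VALUE_PER_DOLLAR (here k ≈ 1,999).
# Everything above that threshold goes to the charity.
```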
I would expect this to be at −1 or −2; I’m curious as to why this was downvoted so much, given that the author seems to be stating his views sincerely.
For clarification, the parent is at −4 as I post.
Having just read “I Am A Strange Loop” by Douglas Hofstadter, I’m inclined to take Daniel’s statement at face value as well. I am very put off by the kind of thinking that produces such a statement, but I guess if you REALLY think you have no personal identity, I can’t argue too convincingly against you. The only thing I have to say is this: do you think that your own instance of yourself would mind if you used a destructive teleporter?
The only thing you could really call “me” is the current instance of me. The later ones are, for all intents and purposes, other people.
A destructive teleporter can’t destroy the current instance of me, since it can’t erase the past.
I also note that of the three options in the anthropic trilemma, two cause paradoxes, and the other one (no personal identity) just seems counterintuitive to most people.
I don’t do selfishness. Partially this is because I don’t believe in personal identity, partially because I feel like there’s no fundamental difference between me and anyone else, and partially because it doesn’t make sense to me intuitively. As such, I try to help myself as much as is necessary to help people in general.
I agree with you about personal identity but I don’t think this ontological fact implies that people have no reason to behave in ways we call ‘selfish’. A person’s seemingly irrational concern for ‘themselves’ can be regarded as a proxy for concern about the various projects that person is involved in, which would go awry in their absence.
(However, I suspect this is only a small ingredient of the psychological explanation for why people behave ‘selfishly’, and to the extent that other factors are involved, they are non-rational. But to put that in perspective: the decision to digest or throw up your stomach contents is also non-rational, and for the most part such ‘decisions’ would seem ‘sensible’ to an outside observer. However, when your body is determined to make the ‘wrong’ decision, one can’t always override it with free will alone. It certainly doesn’t make sense to try to ‘talk your stomach round’, or to ‘punish’ it if it chooses the wrong action. Similarly, it’s to be expected that a person who shares our view about the non-existence of identity will continue to act selfishly for hidden reasons, even when this has nothing to do with ‘advancing their projects’.)
It’s only when we start talking about cryonics, teleportation and cloning that the hidden absurdity of ‘selfhood’ comes to light.
It’s only when we start talking about cryonics, teleportation and cloning that the hidden absurdity of ‘selfhood’ comes to light.
What absurdity? Here’s you, Neil, and that’s Tyler. It’s possible to tell who of you two is Neil and who is not. A copy-Neil might be about the same thing as Neil, but this doesn’t interfere with the simplicity of telling that Tyler is not the same thing. You can well care about Neil-like things more than about Tyler-like things. It’s plausible from an evolutionary psychology standpoint that humans care about themselves more than other people. By “myself” I mean “a thing like the one I’m pointing at”, and the rest is a process of evaluating this symbolic reference into a more self-contained definition; this simple reference is sufficient to specify the intended meaning.
What I meant was that the common situation whereby a person both (i) believes in persisting subjective identity (sameness of Cartesian Theater over time) and (ii) attaches massive importance to it (e.g. using words like ‘death’ to refer to its extinction), doesn’t obviously or frequently give rise to irrational decision-making until we start talking about things like cryonics, teleportation and cloning.
I apologise for the unclarity of my final sentence if you took me to be saying something stronger.
A person’s seemingly irrational concern for ‘themselves’ can be regarded as a proxy for concern about the various projects that person is involved in, which would go awry in their absence.
I think you might be under the illusion that desires are rational. After all:
Reason is, and ought only to be the slave of the passions. - David Hume
Was this downvoted because people disagree with the ideas about the mind/personal identity or for some other reason?
I also think that I have no egoistic terms in my utility function, and that utility functions with such terms are likely incoherent due to being based on false ideas (as instantiated in humans, not in general; obviously it is possible to have a utility function that could be described as egoistic). I am unsure whether my utility function has terms that are egoistic conditional on my philosophy of mind being wrong, to which I assign maybe a 20% probability.