Paul Graham suggests keeping your identity as small as sustainable. [1] That is, it’s beneficial to limit your identity to labels like “rationalist” or “scientist”, since those labels commit you to very little and so don’t conflict with keeping a small identity. He puts it better than I do:
There may be some things it’s a net win to include in your identity. For example, being a scientist. But arguably that is more of a placeholder than an actual label—like putting NMI on a form that asks for your middle initial—because it doesn’t commit you to believing anything in particular. A scientist isn’t committed to believing in natural selection in the same way a biblical literalist is committed to rejecting it. All he’s committed to is following the evidence wherever it leads.
Considering yourself a scientist is equivalent to putting a sign in a cupboard saying “this cupboard must be kept empty.” Yes, strictly speaking, you’re putting something in the cupboard, but not in the ordinary sense.
This works well for beliefs included in your identity, but I’ve always been uncertain whether it’s supposed to also extend to things like episodic memories (separate from believing the information contained in them), relationships in neutral groups such as a family or a fandom, precommitments, or mannerisms.
things like episodic memories (separated from believing the information contained in them)
I’m not sure what you’re saying here; you think of your memories as part of your identity?
relationships in neutral groups such as a family or a fandom, precommitments, or mannerisms?
These memberships are all heuristics for expected interactions with people. Nothing actionable is lost if you Bayes-induct for each situation separately; the only cost is the extra computation, and you drop the cognitive biases and emotional reactions that come from claiming “membership”. Alternatively, you could still use the membership heuristic, but with a mental footnote that you’re only using it because it’s convenient, and that there are senses in which the membership’s representation of you may be misleading.
@episodic memories: I don’t personally have any like that, but I hear many people do consider the subjective experience of pivotal events in their life as part of who they are.
@relationships: I’m talking about the literal membership here, the thing that exists as a function of the entanglement between states in different brains.
To clarify, I’m not talking about “your identity” here as in the information about what you consider your identity, but rather the referent of that identity. To many people, their physical bodies are part of their identity in this sense. Even distant objects, or large organizations like nations, can be in extreme cases. Just because it’s a trend here to count only information residing in your own brain as part of your identity doesn’t mean that view is necessary, or even especially common in its pure form in most places.
To clarify, I’m not talking about “your identity” here as in the information about what you consider your identity, but rather the referent of that identity.
Ah, it appears we’re talking about different things. I’m referring to ideological identity (“I’m a rationalist”, “I’m a libertarian”, “I’m pro-choice”, “I’m an activist”), which I think is distinct from “I’m my mind” identity. In particular, you can be primed psychologically and emotionally by the former more than the latter.
How much of an identity is just right?
“I’m a gorgeous blonde child who roams the forest alone stealing food from bears.” is just right.
[1] http://www.paulgraham.com/identity.html
It seems like we both, and possibly the original Keeping Your Identity Small article, are committing the typical mind fallacy.
My guess would be only as large as necessary to capture your terminal values, insofar as humans have terminal values.
“How much” I’m not sure, but a strategy that I find promising and that is rarely talked about is identity min-maxing.