Hmm, I want a term that refers to all those many dimensions together, since for any given ‘shared reality’ experience it might be like 30% concepts, 30% visual & auditory, 30% emotion/values, etc.
I’m down to factor them out and refer to shared emotions/facts/etc., but I still want something that gestures at the larger thing. ‘Shared experience’ could do the trick, I think, but it feels a bit too subjective, since what’s being shared often involves interpretations of the world that feel like ‘true facts’ to the observer.
Wherein I write more, because I’m excited about all this:
The first time I heard the term ‘shared reality’ was in this podcast with Bruce Ecker, the guy who co-wrote Unlocking the Emotional Brain. He was giving an example of how a desire for ‘shared reality’ can make it hard to come to terms with e.g. emotional trauma.
by believing the parent’s negative messages to you (either verbal or behavioral), you’re staying in shared reality: and that’s a big aspect of attachment. … especially shared reality about yourself: ‘they think I’m a piece of crap, and I do too. So I feel seen and known by them even if the content is negative’.
In this case, the parent thinks the kid is a ‘piece of crap’, which I expect doesn’t feel like an emotion to the parent; it feels like a fact about the world. If they were more intellectually mature they might notice that this was an evaluation, but it’s actually super hard to disentangle evaluations and facts.
I guess I think it’s maybe impossible to disentangle them in many cases? Like… I think ‘facts’ are typically not a discrete thing that we can successfully point at; they’re tied up with intentions/values/feelings/frames/functions. I think Dreyfus made this critique of early attempts at AI, and I think he ended up being right (or at least that’s my charitable interpretation of his point): it’s only within an optimization process, working toward something, that knowledge (knowing what to do given XYZ) gets created.
Maybe this is an is/ought thing. I certainly think there’s an external world/territory and it’s important to distinguish between that and our interpretations of it. And we can check our interpretations against the world to see how ‘factual’ they are. And there are models of that world like physics that aren’t tied up in some specific intention. But I think the ‘ought’ frame slips into things as soon as we take any action, because we’re inherently prioritizing our attention/efforts/etc. So even a sharing of ‘facts’ involves plenty of ought/values in the frame (like the value of truth-seeking).
I very much agree that when you’re not getting feeling X it can be very difficult to distinguish territory disagreements from feeling disagreements, especially when you’re SNS (sympathetic nervous system) activated. Having a term to cover all cases seems extremely useful. It also seems useful to have specific terms for the subsets, to help tease issues apart.
“Reality” to me seems much better suited towards the narrow, territory-focused aspect of feeling X, and I see a lot of costs in diluting it. Both because I wish I could say “reality” instead of clunkier things like “narrow, territory-focused”, and because having a term where it’s ambiguous whether you mean feeling X or objective facts is just begging for explosive arguments. I’m particularly worried about person A’s (inside view) refusal to change their view of facts without new data feeling to person B like a refusal to care about feeling X.
Ideas for feeling X:
“Shared subjective reality” isn’t great because it’s kind of long and the whole point of reality is it’s not subjective, but does substantially address my concerns, in a way “reality*” doesn’t.
“being ingroup” or “on your side-ness”. These aren’t synonymous with feeling X but are a lot of what I want out of it.
“shared frame” also is not synonymous but captures an important aspect.
Man, I really wanted a longer, better list, but it’s pretty hard.
Also, thanks for starting this conversation; I’m finding it really valuable even if that’s manifesting mostly as critique.
Sure! I love talking about this concept-cluster.

I have a hunch that in practice the use of the term ‘shared reality’ doesn’t actually ruin one’s ability to refer to territory-reality. In the instances when I’ve used the term in conversation I haven’t noticed this (and I like to refer to the territory a lot). But maybe with more widespread usage and misinterpretation it could start to be a problem?
I think to get a better sense of your concern it might be useful to dive into specific conversations/dynamics where this might go wrong.
...
I can imagine a world where I want to be able to point out that someone is making the psychological mistake of confusing their desire to connect with their map-making. And I want the term I use to do that work, so I can just say “you want to share your subjective experience with me, but I’m disagreeing with you about reality, not subjective experience.”

Does that kind of resonate with your concern?
Yeah I definitely don’t think calling it “shared reality” will ruin anything. It would be another few snowflakes in the avalanche of territory-map ambiguation, similar to when people use “true” to mean “good” rather than “factually accurate”.
I’ve made a couple of attempts at a longer response and just keep bouncing off, so I think I’m out of concepts for now. Would love to pick this up in person if we run into each other.
Yeah, let’s do in-person sometime; I also tried drafting long responses and they were terrible.