This feels highly related to Simulacra levels.
If it’s merely about me preferring “contextualizing norms”, then I should be able to recognize, in the context of a scientific study, that the context is such that I can basically just tell the truth.
However, if I’ve gotten to a point where I literally can’t separate out social signalling from truth signalling (Simulacra level 3), then you’d expect a result like you see here.
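(To make the truth in question concrete: assuming, as the followup title below suggests, that the result here is the conjunction fallacy, the literal answer is settled by a one-line probability fact that holds for any events $A$ and $B$:

$$P(A \wedge B) \le P(A)$$

In the well-known Linda version of the experiment, “bank teller and feminist” can never be more probable than “bank teller” alone, whatever answering that way signals socially.)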
I thought about linking that, but decided against it because I feel like that post is mostly about rationalists getting confused about contextualization and needing a guide to understand it (in particular, confusion about how people who care more about social reality than “physical” reality pay more attention to how others will interpret words than to what the words are nominally agreed to mean), rather than about what it means to think in a highly contextualized way. It’s somewhat adjacent, a causal sibling of the phenomenon being asked about in this post.
Maybe it’s just your phrasing, but I feel like this is subtly missing what it means to contextualize by supposing you can create a context where something can be left out; it’s like saying “let me create a new set of everything that doesn’t include everything”.
I’m confused by what you mean when you say “just tell the truth”. The only interpretation that comes to mind is one where you mean something like the contextualized perspective is not capable of saying anything true, and that seems insufficiently charitable.
I think contextualization allows something like understanding how the study intends for me to respond and using that to guess the teacher’s password, rather than falling for what I would consider the epistemic trap of thinking the study’s isolating perspective is the “real” one. Maybe that’s what you meant?
I think a proper contextualizing perspective would recognize that the study’s isolated perspective is indeed one of the most relevant perspectives when in the study. If I’m tracking what people will think of me when in fact what I do during the study won’t get back to people I care about at all, I’m not properly tracking context; instead, I’ve internalized tribal perspectives so much that I can’t actually separate them from real context.
To me this is what separates Simulacra levels from contextualizing.
People were interviewed after the research and asked to explain their answers. There were social feedback mechanisms. Even if there wasn’t peer-to-peer social feedback, it was certainly possible to annoy the authority (the researchers) giving you the questions (like annoying a teacher who gives you a test). The researchers want you to answer a particular way, so people reasonably guess what that is, even if they don’t already have that way highly internalized (as most people do).
This is how people have learned to deal with questions in general. And people are correct to be very wary of guessing “it’s safe to be literal now”: often when it looks safe, it’s not, so people come to the reasonable rule of thumb that it’s never safe. They decide, though not as a conscious decision, that maintaining a literalist personality to be used very rarely, when it’s hard to even identify any safe times to use it, is not worth the cost. People have near-zero experience in situations where being hyper-literal (or whatever you want to call it) won’t be punished. Those scenarios barely exist. Even science, academia, and Less Wrong mostly aren’t like that.
More on this in my followup post: Asch Conformity Could Explain the Conjunction Fallacy