My psychologist said today, that there is some information that should not be known. I replied that rationalists believe in reality. There might be information they don’t find interesting (e.g. not all of you would find children interesting), but refusing to accept some information would mean refusing to accept some part of reality, and that would be against the belief in reality.
Since I have recently been asking myself the questions “why do I believe what I believe” and “what would happen if I believed otherwise than what I believe” (I’m still pondering whether I should post my cogitations: they’re interesting, but somewhat private), I asked the question “what would happen if rationalists believed otherwise than what they believe”. The problem is that this is such a backwards description that I can’t imagine the answer. Is the answer simply “they would be normal people, like my psychologist”? Or is it a deeper question?
Did your psychologist describe the type of information that should not be known?
In any case, I’m not completely sure that accepting new information (never mind seeking it out) is always fully compatible with rationality-as-winning. Nick Bostrom for example has compiled a taxonomy of information hazards over on his site; any of them could potentially be severe enough to overcome the informational advantage of their underlying data. Of course, they do seem to be pretty rare, and I don’t think a precautionary principle with regard to information is justified in the absence of fairly strong and specific reasoning.
No, it was more of a general statement. AFAIR we were talking about me thinking too much about why other people do what they do and too little about how that affects me. Anyway—my own wording made me wonder more about what I said than what was the topic.
Many thanks for the link to the Information Hazards paper. I didn’t know it existed, and I’m sort of surprised that I hadn’t seen it here on LW already.
He mentions intending to write a follow-up paper toward the end, but I looked for the Information Hazards paper on Bostrom’s website and I don’t see a second one next to it. Any idea if it exists?
what would happen if rationalists believed otherwise than what they believe
They wouldn’t be rationalists anymore, duh.
Taboo “rationalists”: What would happen if you stopped trying to change your map to better reflect the territory? It most probably would reflect the territory less.
Is the answer simply “they would be normal people, like my psychologist”?
“Normal people” are not all the same. (For example, many “normal people” are unlike your psychologist.) Which of the many subkinds of the “normal people” do you mean?
Some things are unrelated. For example, let’s suppose that you are a rationalist, and you also have a broken leg. That’s two things that make you different from the average human. But those two things are unrelated. It would be a mistake to think—an average human doesn’t have a broken leg; by giving up my rationality I will become more similar to the average human, therefore giving up my rationality will heal my leg.
Replace “broken leg” with whatever problem you are discussing with your psychologist. Do you have evidence that rational people are more likely to have this specific problem than irrational (but otherwise similar: same social background, same education, same character, same health problems) people?
Taboo “rationalists”: What would happen if you stopped trying to change your map to better reflect the territory? It most probably would reflect the territory less.
That’s a behavior, not a belief.
It most probably would reflect the territory less.
There are many instances where trying to change a belief makes the belief stronger. People who are very much attached to their beliefs usually don’t update.
Many mainstream professional psychologists follow a code that means they don’t share deep information about their own private lives with their clients.
I don’t believe in that ideal of professionalism but it’s not straightforward to dismiss it.
More importantly, a good psychologist doesn’t confront his client with information about the client that’s not helpful for them. He doesn’t say: “Your life is a mess because of points 1 to 30.” That’s certainly information that’s interesting to the client, but not helpful. It makes much more sense to let the client figure out things on his own, or to guide him toward specific issues that the client is actually in a position to change.
On Monday I gave someone meaningful, true information about themselves that I considered helpful to them. Their first reaction was: “I don’t want to have nightmares. Don’t give them to me.”
I do have a policy of being honest, but that doesn’t entail telling someone true information that they didn’t ask for and that messes them up. I don’t think that any good psychologist will just share all the information that is available. It’s just a bad strategy when you are having a discussion about intimate personal topics.
Well, some people don’t want to be given information, and some people do. It’s often difficult to know where a specific person belongs; and it is a reasonable assumption that they most likely belong to the “don’t want to know” group.
The problem with saying “some information should not be known” is that it does not specify who shouldn’t know (and why).
Well, some people don’t want to be given information, and some people do.
That a person wants to be given information doesn’t mean that he can handle it. I can remember a few instances where I swore that I wanted information but wasn’t well equipped to handle it.
The problem with saying “some information should not be known” is that it does not specify who shouldn’t know (and why).
That sentence alone doesn’t, but the psychologist probably had a context in which he said it.
Gah. Now I think I shouldn’t have included the background for my question.
FYI, what I wrote in response to some other comment:
it was more of a general statement. AFAIR we were talking about me thinking too much about why other people do what they do (hence—I have too much information about them) and too little about how that affects me. Anyway—my own wording made me wonder more about what I said than what was the topic.
But reading you is still interesting.
So, information that shouldn’t be known?
Your psychologist’s job is to help you learn to live in the real world. Advocacy of selective ignorance is highly suspect.