I would pick the black box, but it’s a hard choice. Given all the usual suppositions about Omega as a sufficiently trustworthy superintelligence, I would assume that the utilities really were as it said and take the false information. But it would be painful, both because I want to be the kind of person who pursues and acts upon the truth, and also because I would be desperately curious to know what sort of true and non-misleading belief could cause that much disutility—was Lovecraft right after all? I’d probably try to bargain with Omega to let me know the true belief for just a minute before erasing it from my memory—but still, in the Least Convenient Possible World where my curiosity was never satisfied, I’d hold my nose and pick the black box.
Having answered the hypothetical, I’ll go on and say that I’m not sure there’s much to take from it. Clearly, I don’t value Truth for its own sake over and beyond all other considerations, let the heavens fall—but I never thought I did, and I doubt many here do. The point is that in the real world, where we don’t yet have trustworthy superintelligences, the general rule that your plans will go better when you use an accurate map doesn’t seem to admit of exceptions (and little though I understand Friendly AI, I’d be willing to bet that this rule holds post-singularity). Yes, there are times where you might be better off with a false belief, but you can’t predictably know in advance when that is, black swan blow-ups, etc.
To be more concrete, I don’t think there’s any real-world analogue to the hypothetical. If a consortium of the world’s top psychiatrists announced that, no really, believing in God makes people happier, more productive, more successful, etc., and that this conclusion holds even for firm atheists who work for years to argue themselves into knots of self-deception, and that this conclusion has the strongest sort of experimental support that you could expect in this field, I’d probably just shrug and say “I defy the data”. When it comes to purposeful self-deception, it really would take Omega to get me on board.
That’s exactly why the problem invokes Omega, yes. You need an awful lot of information to know which false beliefs actually are superior to the truth (and which facts might be harmful), and by the time you have it, it’s generally too late.
That said, the best real-world analogy that exists remains amnesia drugs. If you did have a traumatic experience, serious enough that you felt unable to cope with it, and you were experiencing PTSD or depression related to the trauma that impeded you from continuing with your life… but a magic pill could make it all go away, with no side effects, and with enough precision that you’d forget only the traumatic event… would you take the pill?
Okay, I suppose that probably is a more relevant question. The best answer I can give is that I would be extremely hesitant to do this. I’ve never experienced anything like this, so I’m open to the idea that there’s a pain here I simply can’t understand. But I would certainly want to work very hard to find a way to deal with the situation without erasing my memory, and I would expect to do better in the long-term because of it. Having any substantial part of my memory erased is a terrifying thought to me, as it’s really about the closest thing I can imagine to “experiencing” death.
But I also see a distinction between limiting your access to the truth for narrow, strategic reasons, and outright self-deception. There are all kinds of reasons one might want the truth withheld, especially when the withholding is merely a delay (think spoilers, the Bayesian Conspiracy, surprise parties for everyone except Alicorn, etc.). In those situations, I would still want to know that the truth was being kept from me, understand why it was being done, and most importantly, know under what circumstances it would be optimal to discover it.
So maybe amnesia drugs fit into that model. If all other solutions failed, I’d probably take them to make the nightmares stop, especially if I still had access to the memory and the potential to face it again when I was stronger. But I would still want to know there was something I blocked out and was unable to bear. What if the memory was lost forever and I could never even know that fact? That really does seem like part of me is dying, so choosing it would require the sort of pain that would make me wish for (limited) death—which is obviously pretty extreme, and probably more than I can imagine for a traumatic memory.
For some genotypes, more trauma is associated with lower levels of depression

Yet telling someone experiencing trauma that they are better off continuing to suffer would hypothetically lead to learned helplessness and worse depression. So the belief is true, yet the false belief is the more productive one.
That said, genetic epidemiology is weird and I don’t understand the literature beyond this book. I was prompted to investigate it based on some counterintuitive outcomes regarding treatment for psychological trauma and depressive symptomatology, established counterintuitive results about mindfulness and depressive symptoms in Parkinson’s and schizophrenia, and some disclosed SNP sequences from a known individual.
Nobody makes plans based on totally accurate maps. Good maps contain simplifications of reality that allow you to make better decisions.
You start to teach children how atoms work by putting the image of atoms as spheres into their heads. You don’t start by teaching them a model that’s up to date with current scientific knowledge of how atoms work. The current model is more accurate but less useful for the children.
You calculate how airplanes fly with Newton’s equations instead of Einstein’s.
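As a back-of-the-envelope illustration (my own sketch, not from the original discussion), here is a small Python snippet showing just how negligible the relativistic correction is at airliner speeds. The cruise speed of 250 m/s is an assumed round figure:

```python
import math

C = 299_792_458.0   # speed of light in vacuum, m/s
v = 250.0           # assumed airliner cruise speed, m/s

# Lorentz factor: the ratio between the relativistic prediction and the
# Newtonian one for quantities like momentum and kinetic energy.
gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)

# Relative error of the Newtonian approximation at this speed.
error = gamma - 1.0
print(f"gamma = {gamma!r}, relative error of Newton's 'map' ~ {error:.1e}")
```

The Newtonian map is off by well under one part in a trillion here, which is why nobody reaches for special relativity when designing a wing: the less accurate map is the more useful one.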
In social situations it can also often help to avoid getting certain information.
You don’t have a job. You ask a friend to get you one. The job pays well. He assures you that the work you’ll be doing helps the greater good of the world.
He does, however, also tell you that some of the people you will work with do things in their private lives that you won’t like.
Would you want him to tell you that your new boss secretly burns little puppies at night? The boss also doesn’t take it kindly if people criticize him for it.
Well, yes, I would. Of course, it’s not like he could actually say to me “your boss secretly burns puppies—do you want to know this or not?” But if he said something like “your boss has a dark and disturbing secret which might concern you; we won’t get in trouble just for talking about it, but he won’t take kindly to criticism—do you want me to tell you?”, then yeah, I would definitely want to know. The boss is already burning puppies, so it’s not like the first-level harm is any worse just because I know about it. Maybe I decide I can’t work for someone like that, maybe not, but I’m glad that I know not to leave him alone with my puppies.
Now of course, this doesn’t mean it’s of prime importance to go around hunting for people’s dark secrets. It’s rarely necessary to know these things about someone to make good decisions on a day-to-day basis, the investigation is rarely worth the cost (both in terms of the effort required and the potential blow-ups from getting caught snooping around in the wrong places), and I care independently about not violating people’s privacy. But if you stipulate a situation where I could somehow learn something in a way that skips over these concerns, then sure, give me the dark secret!
The boss is already burning puppies, so it’s not like the first-level harm is any worse just because I know about it.
Knowing the dark secret will produce resentment toward your boss. That resentment is likely to make it harder for you to get work done. If you see him with a big smile in the morning, you won’t think
“He seems like a nice guy because he’s smiling” but “Is he so happy because he burned puppies yesterday?”
Well, maybe. I’m actually skeptical that it would have much effect on my productivity. But to reverse the question, suppose you actually did know this about your boss. If you could snap your fingers and erase the knowledge from your brain, would you do it? Would you go on deleting all information that causes you to resent someone, so long as that information wasn’t visibly relevant to some other pending decision?
Deleting information doesn’t make emotions go away. Being afraid and not knowing the reason for being afraid is much worse than just being afraid.
You start to rationalize the emotions with bogus stories to make them make sense.
The point is that in the real world … the general rule that your plans will go better when you use an accurate map doesn’t seem to admit of exceptions
Azathoth built you in such a way that having certain beliefs can screw you over, even when they’re true. (Well, I think it’s the aliefs that actually matter, but deliberately keeping aliefs and beliefs separate is an Advanced Technique.)