I have yet to read the Bayes' Theorem article. I understand that it is very much a prerequisite for many of the others, yet I simply have not. It is a very long and complicated article, and would take a significant investment of time and thought to read. Procrastination has always been my single greatest flaw, one I struggle with every day. I recognize the importance of reading it, but that's only a belief. It seems very hard to integrate the importance of rationality to the point where I can really feel it deep down. When I think or act irrationally, I recognize that I am doing so, and I consciously recognize that it is wrong. Yet I find it hard to strongly feel that it is so. It seems one needs to understand the importance of rationality intuitively before it can be applied. The elephant must first give the rider some control before he can do anything. Do you have any advice concerning that?
Oh and Raiden is my actual name.
I can't say I blame you for not reading it; it took me about three months to get through it! However, Common Sense Atheism has An Intuitive Explanation of Eliezer Yudkowsky's Intuitive Explanation of Bayes' Theorem, which is much easier to read and explains many of the bits that Eliezer skips over.
As for integrating the importance of rationality, I scarcely know where to begin; it's a large topic. First and foremost, read this. Secondly, realise how important opinions are, and that it's not okay to simply "have your own opinion", as schools commonly condition their students to believe. That's not to say you shouldn't hold different opinions from other people when you have a (justified) belief that your opinion reflects reality better; it's to say it's not okay to have false opinions. One of the reasons reading An Intuitive Explanation is important is that it helps convey the idea that your beliefs should correspond exactly to what you expect from reality.
It's hard to put into context why this is so important, but think back, if you will, to the causes of the Holocaust in WW2. All of that happened because of what people believed. To get a taste of how bad that sort of thing is, look at current affairs: what is happening in Syria, Bahrain, and so on. Religious extremists are another example of this, as are racists. All of these "evil" people are not inherently evil; they just have beliefs that make them think that what they're doing is right.
All of that injustice done, all of those people hurt and killed because people are not inherently rational. You could end up as one of those victims, but the worst part is you could be the one doing the evil and not even know it. It might help to read this.
On a separate note, regarding emotions and acting irrationally because of them: the best thing is to reduce them, that is, break each one down and look at what is actually causing it. Either you'll find that the emotion is flagging a real problem and you can take the appropriate action, or it will end up dissipating.
Sorry that turned into a bit of an essay, but it is a big topic and my experience of explaining it is limited. If you've got questions or anything, I'm more than happy to answer (or try to answer) them. Hope all this is helpful.
And my bad; I've only ever encountered your name in relation to MGS. My apologies for that.
Thanks for your input, it was quite enlightening. I especially appreciate the Common Sense Atheism post. That’s a wonderful blog and what originally led me to this site, but I had no idea that article was on there.
Concerning what you said about the Holocaust and such, that had actually occurred to me before, but in a different manner. I reasoned that even if I felt 99% certain that my moral beliefs were accurate, there was that 1% chance that they could be wrong. Hitler may well have felt 99% certain that he was correct. I became too afraid to really do much of anything. I thought, "What if it is in some weird way the utmost evil to not kill millions of people? It seems unlikely, but it seemed unlikely to Hitler that he was in the wrong. What if somehow, similarly, it is wrong to try to ascertain what is right? What if rationality is somehow immoral?"
Of course I never actually consciously thought that was true, but I fear my subconscious still believes it. That is my greatest debilitation, that lingering uncertainty. I now consciously hold the idea that it is at least better to try and be right than to not try at all, that it would be better to be Hitler than to be 40 years old and living with my mom, but my subconscious still hasn’t accepted that.
I believe that is why I have difficulty integrating rationality. Some part of my mind somewhere says, “But what if this is wrong? What if this is evil? You’re only 99.99999999% certain. What if religious fundamentalism is the only moral choice?”
It’s not a rhetorical question, you know. What happens if you try to answer it?
I have a pill in my hand. I’m .99 confident that, if I take it, it will grant me a thousand units of something valuable. (It doesn’t matter for our purposes right now what that unit is. We sometimes call it “utilons” around here, just for the sake of convenient reference.) But there’s also a .01 chance that it will instead take away ten thousand utilons. What should I do?
It’s called reasoning under uncertainty, and humans aren’t very good at it naturally. Personally, my instinct is to either say “well, it’s almost certain to have a good effect, so I’ll take the pill” or “well, it would be really bad if it had a bad effect, so I won’t take the pill”, and lots of studies show that which of those I say can be influenced by all kinds of things that really have nothing to do with which choice leaves me better off.
One way to approach problems like this is by calculating expected values. Taking the pill gives me a .99 chance of 1000 utilons and a .01 chance of −10000 utilons; the expected value is therefore .99 × 1000 − .01 × 10000 = 990 − 100 = 890. The result is positive, so I should take the pill. If I instead estimated a .9 chance of upside and a .1 chance of downside, the EV calculation would be .9 × 1000 − .1 × 10000 = 900 − 1000 = −100; negative result, so I shouldn't take the pill.
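To make that arithmetic concrete, here is a minimal sketch of the same calculation (the function name and all the numbers are just the hypothetical ones from the pill example, nothing more):

```python
# A minimal sketch of the expected-value calculation above. The payoffs and
# probabilities are the hypothetical pill-example numbers, not anything real.

def expected_value(p_gain, gain, p_loss, loss):
    """Probability-weighted average of the two possible outcomes."""
    return p_gain * gain - p_loss * loss

print(expected_value(0.99, 1000, 0.01, 10000))  # 890.0  -> positive, take the pill
print(expected_value(0.90, 1000, 0.10, 10000))  # -100.0 -> negative, leave it alone
```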
There are weaknesses to that approach, but it has definite advantages relative to the one that’s wired into my brain in a lot of cases.
The same principle applies if I estimate a .99 chance that by adopting the ideology in my hand, I will make better choices, and a .01 chance that adopting that ideology will lead me to do evil things instead.
Of course, what that means is that there’s a huge difference between being 99% certain and being 99.99999999% certain. It means that there’s a huge difference between being mistaken in a way that kills millions of people, and being mistaken in a way that kills ten people. It means that it’s not enough to say “that’s good” or “that’s evil”; I actually have to do the math, which takes effort. That’s an offputting proposition; it’s far simpler to stick with my instinctive analysis, even if it’s less useful.
At some point, the question becomes whether I feel like making that effort.
Glad to be of help!
Well, the thing about probabilities (in Bayesian statistics) is that they represent the amount of evidence you have for the true state of reality. In general, being 50% certain means you have no evidence for your belief, less than 50% means you have evidence against it, and greater than 50% means you have evidence for it. You'll get to it as you read more of An Intuitive Explanation.
The important thing to note is that to be 99% certain something is true, as a rationalist, you actually have to have evidence for it being true. Rather than just feeling that you're 99% certain, Bayes' theorem lets you see how much evidence you actually have in a purely quantitative way. That's why there's so much talk of "calibration" here; it's an attempt at aligning the feeling of how certain you are with how certain the evidence says you should be.
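As a rough, purely illustrative sketch of what "seeing how much evidence you have" means quantitatively, here is the odds form of Bayes' theorem in a few lines (the 3:1 likelihood ratio is a made-up number, not anything from the article):

```python
# Sketch of a Bayesian update in odds form; all numbers are made up purely
# to illustrate how evidence moves a probability.

def update(prior, likelihood_ratio):
    """Posterior after evidence that is likelihood_ratio times more likely
    if the hypothesis is true than if it is false."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

p = 0.5                  # 50%: no net evidence either way
for _ in range(4):
    p = update(p, 3.0)   # each observation favours the hypothesis 3:1
    print(round(p, 3))   # 0.75, 0.9, 0.964, 0.988

# To honestly claim 99% certainty starting from 50%, your combined evidence
# needs a likelihood ratio of about 99:1.
```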
You can also work out the expected cost of your actions in the event that you are wrong. For Hitler, if he thought there was a 1% chance of his being wrong, he could work out the expected number of wasted lives as 0.01 × 11,000,000 = 110,000 (and that's using the lower bound of people killed during the Holocaust). Hence, if I were Hitler, I wouldn't risk instigating the Holocaust until I had much more information/evidence. Being rational is about looking at the way the world is and acting based on that.
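Spelled out with the hypothetical numbers from the paragraph above (the 1% chance of being wrong and the 11,000,000 lower bound), the arithmetic is simply:

```python
# The same back-of-the-envelope arithmetic as the paragraph above.
p_wrong = 0.01                 # hypothetical chance the underlying belief is mistaken
lives_at_stake = 11_000_000    # lower-bound death toll cited above

print(p_wrong * lives_at_stake)  # 110000.0 expected wasted lives
```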
The point is, the most moral thing to do is the thing most likely to be moral. If God turns out to exist (although there are masses of evidence against that) and he asks you why you weren't a religious fundamentalist, you'll have a damn good answer.