I stopped being a theist a few years ago. That was due more to what Less Wrong people would call “traditional rationalism” than the sort often advocated here (I actually identify as closer to a traditional rationalist than a strict Bayesian, but I suspect that the level of disagreement is smaller than Eliezer makes it out to be). And part of this was certainly also an emotional reaction to having the theodicy problem thrown in my face, rather than direct logic.
One major update occurred when I first took intro psych: realizing how profoundly irrational the default human thinking processes are. Before then, my general attitude was very close to seeing humans as the rational animal. I’m not sure how relevant that is, since that’s saying something like “learning about biases taught me that we are biased.” I don’t know if that’s very helpful.
My political views have updated a lot on a variety of different issues. But I suspect that some of those are due to spending time with people who have those views rather than actually getting relevant evidence.
I’ve updated on how dangerous extreme theism is. It may sound strange, but this didn’t arise so much from things like terrorism as from becoming more aware of how many strongly held beliefs about the nature of the world are out there that are motivated by religion and utterly at odds with reality. This was not about evolution, which even in my religious phases I understood, and I was annoyed by the failure of my religious compatriots to understand it. Rather, this has included geocentrism among the Abrahamic religions, flat-Earthism among some Islamic extremists, spontaneous generation among ultra-Orthodox Jews (no, really. Not a joke. And not even microscopic spontaneous generation, but spontaneous generation of mice), and the belief among some ultra-Orthodox Jews that the kidneys are the source of moral guidance (which they use as an argument against kidney transplants).
My three most recent major updates (in the last six months or so) are: 1) Thinking that cryonics has a substantial success probability (although I still think it is very low). This came not from actually learning more about rationality, but from going back, after reading some of the stuff here, and trying to find out more about cryonics. Learning that the ice formation problem is close to completely solved substantially changed my attitude. 2) Deciding that there’s a high chance we’ll have space elevators before we have practical fusion power. (This is a less trivial observation than one might think, since once one has a decent space elevator it becomes pretty cheap to put up solar power satellites.) This is to some extent a reevaluation based primarily on time-frames given by relevant experts. 3) Deciding that there’s a substantial chance that P=NP may be undecidable in ZFC. This update occurred because I was reading about how complexity results can be connected to the provability of certain classes of statements in weakened forms of the Peano axioms. That makes it sound like P=NP might belong to a class of problems that have decent reasons for being undecidable.
I’m not sure how relevant that is, since that’s saying something like “learning about biases taught me that we are biased.” I don’t know if that’s very helpful.
It is!
I am repeatedly surprised by a) basic-level insights that are not widespread, b) insights that other people consider basic that I do not have, and c) applications of an idea I understand to an area I did not think of applying it to.
To list a few:
People are biased ⇒ I am biased!
Change is possible
Understanding is possible
I am a brain in a vat.
Real life rocks :-)
Even after learning about cached thoughts, happy death spirals, and many others, I still managed to fall into those traps.
So I consider it helpful to see where someone falls prey to biases.
My political views have updated a lot on a variety of different issues. But I suspect that some of those are due to spending time with people who have those views rather than actually getting relevant evidence.
That statement in itself looks like a warning sign.
That statement in itself looks like a warning sign.
Yeah, being aware that there are biases at play doesn’t mean I’m at all sure I’m able to correct for all of them. The problem is made more complicated by the fact that for each of the views in question, I can point to new information leading to the update. But I don’t know if, in general, that’s the actual cause of the updates.