Well, here are my assessments of rationality’s weakest points, from what I have read on Less Wrong so far. (That means some of these use “Rationality” where “Less Wrong” may be the better term, which could be a crippling flaw in my list of weaknesses.) It sounds like you may be looking for something like these:
1: Several aspects of rationality require what appears to be a significant amount of math and philosophy to get right. If you don’t understand the math or the philosophy, you aren’t really being a rationalist; you’re just following the lead of other rationalists because they seem smart, or possibly because you liked their books, and occasionally cheerleading them on. Rationality needs more, simpler explanations that can be easily followed by people who are not in the top 1% of brain processing.
2: Rationality also requires quick explanations. Some people who study rationality realize this: when attempting to automate their decision theory, the question “How do we make a computer that doesn’t turn the entire universe into more computer just to be absolutely sure that 2+2 is 4?” is treated as a substantial problem. Yet quick answers tend to be eschewed in favor of ever more levels of clarity, each taking an increasingly large amount of time, and even when confronted with the obvious problem that pursuing nothing but clarity causes, Rationalists treat it as something that requires still more research into how to be clear.
3: Rationality decision problems really don’t rise above the level of religion. Consider that in many rationality decision problems, the first thing Rationalists do is presuppose “Omega,” who is essentially “God” with the serial numbers filed off. Infinite (or extremely high) utility and disutility are thrown around like so many parallels of Heaven and Hell. This makes a lot of rationality problems the kind of thing that those boring philosophers of the past (whom Rationalists are so quick to eschew) have discussed ad nauseam.
It’s hard to really grasp the scope of these problems (I’m one of those people from point 1 who doesn’t quite get some of the mathier bits sometimes), and I’m not sure any of them are fatal to rationality as a decision-making method, since I still read the site and consider it trustworthy. But if you were going to look for weak points, you could start with any of these.
Does that help?
Leaving aside the rest of this comment, please note that in many cases we throw around large numbers and high probabilities in order to obviously break fragile systems that wouldn’t break as obviously if we used small numbers and middling probabilities.
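That stress-testing idea can be sketched in a few lines. This is my own illustration, not from the comment above: the `expected_utility` helper and all the numbers are hypothetical, chosen only to show how a single extreme outcome swamps an otherwise sensible-looking rule, while moderate inputs would never have exposed the fragility.

```python
def expected_utility(outcomes):
    """Sum probability * utility over (probability, utility) pairs."""
    return sum(p * u for p, u in outcomes)

# Moderate inputs: the rule behaves sensibly and the two options are
# symmetric, so nothing looks wrong.
take_umbrella = expected_utility([(0.3, 10), (0.7, -1)])    # 2.3
leave_umbrella = expected_utility([(0.3, -10), (0.7, 1)])   # -2.3

# Extreme inputs: a one-in-a-billion chance of a huge payoff dominates
# everything else, and the fragility shows up immediately.
mugging = expected_utility([(1e-9, 1e15), (1 - 1e-9, -1)])  # ~999999.0
```

With middling probabilities the flaw stays hidden; the absurdly large utility makes the break obvious in one step, which is the whole point of reaching for such numbers.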
That makes intuitive sense to me, since I’ve worked in programming. Thanks!