Rationality decomposes into instrumental rationality (‘winning’, or effectiveness; reliably achieving one’s goals) and epistemic rationality (accuracy; reliably forming true beliefs).
How do you understand ‘utilitarianism’? What are the things about it that you think are unimportant or counterproductive for systematic rationality? (I’ll hold off on asking about what things make utilitarianism unpopular or difficult to market, for the moment.)
And this need popularizing? You mean you’ll tell people “I can teach you how the world really works and how to win” and they run away screaming “Nooooo!” ? :-D
If I said that to some random stranger, I wouldn’t expect “noooooooo”, but I might expect “get in line”.
Yes, probably for most people. First, it sounds arrogant. Second, people underestimate the possibility of dramatically improved instrumental rationality. Third, a lot of people underestimate the desirability of dramatically improved epistemic rationality, and it’s especially hard to recognize that there are falsehoods you think you know. (As opposed to thinking there are truths you don’t know, which is easier to recognize but much more trivial.)
But that’s missing the point, methinks. Even if offering to teach people those things in the abstract were the easiest sell in the world, the specific tricks that actually add up to rationality are often difficult sells, and even when they’re easy sells in principle, there’s insufficient research on how best to make that sell, and insufficient funding for, y’know, making it.
How do you know that people “underestimate the desirability of dramatically improved epistemic rationality”?
Yvain (and others) have argued that people around here make precisely the opposite mistake.
Or maybe there’s a lot of utility in not coming across as geeky and selfish, so they are already being instrumentally rational.
First, actually, comes credibility. You want to teach me how the world really works? Prove to me that your views are correct and not mistaken. You want to teach me how to win? Show me a million bucks in your bank account.
Keep in mind that in non-LW terms you’re basically trying to teach people logic and science. The idea that by teaching common people logic and science the world can be made a better place is a very old one, probably dating back to the Enlightenment. It wasn’t an overwhelming success. It is probably an interesting and relevant question why not.
Considering that, between then and now, we’ve had an Industrial Revolution in addition to many political ones, maybe it actually was?
I agree; this is an idea I would like to hear someone else’s opinion on. My intuition is that teaching people logic and science has nothing to do with making them better people; it makes them more effective at whatever they want to do, max. Trying to teach “being a better person” has been attempted (for thousands of years in religious organizations), but maybe not enough in the same places/times as teaching science.
Also, the study of cognitive biases and of how intuitions can be wrong is much more recent than the Enlightenment. Thinking that you know science, and that all your thoughts that feel right are right, is dangerous.
Correct, but that’s what spreading rationality to the masses aims to accomplish, no?
I don’t think teaching people rationality implies giving them a new and improved value system.
I guess CFAR should let Peter Thiel teach in their workshops. Or, more seriously, use his name (assuming he agrees with this) when promoting their workshops.
When I think about this more, there is a deeper objection. Something like this: “So, you believe you are super rational and super winning. And you are talking to me, although I don’t believe you, so you are wasting your time. Is that really the best use of your time? Why don’t you do something… I don’t know exactly what… but something a thousand times more useful right now, instead of talking to me? Because this kind of optimization is precisely what you claim to be good at, and obviously you’re not doing it.”
And this is an objection that makes sense. I mean, it’s like someone trying to convince me that if I invest my money in his plan, my money will double… it’s not that I don’t believe in the possibility of doubling money; it’s more like: why doesn’t this guy double his own money instead? -- Analogously, if you have superpowers that allow you to win, then why the heck are you not winning right now instead of talking to me?
EDIT: I guess we should reflect on our own actions when we are trying to convince someone else of the usefulness of rationality. I mean, if someone resists the idea of LW-style rationality, is it rational (is it winning on average) to spend my time trying to convince them, or should I just say “next” and move on to another person? I mean, there are seven billion people on this planet, half a million in the city where I live, so if one person does not like this idea, it’s not like my efforts are doomed… but I may doom them by wasting all my energy and time trying to convince this specific person. Some people just aren’t interested, and that’s it.
Yep, that’s a valid and serious objection, especially in the utilitarian context.
A couple of ways to try to deal with it: (a) point out the difference between instrumentality and goals; (b) point out that rationality is not binary but a spectrum, so it’s not a choice between winning everything and winning nothing.
You can probably also reformulate the whole issue as “help you deal with life’s problems—let me show you how you can go about them without too much aggravation and hassle”...
I agree that’s an interesting and important question. If we’re looking for vaguely applicable academic terms for what’s being taught, ‘philosophy, mathematics and science’ is a better fit than ‘logic and science’, since it’s not completely obvious to me that traditional logic is very important to what we want to teach to the general public. A lot of the stuff it’s being proposed we teach is still poorly understood, and a lot of the well-understood stuff was not well-understood a hundred years ago, or even 50 years ago, or even 25 years ago. So history is a weak guide here; Enlightenment reformers shared a lot of our ideals but very little of our content.
I don’t agree. You want to teach philosophy as rationality? There are a great many different philosophies; which one will you teach? Or will you teach the history of philosophy? Or meta-philosophy (which very quickly becomes yet-another-philosophy-in-the-long-list-of-those-which-tried-to-be-meta)?
And I really don’t see what math has to do with this at all. If anything, statistics is going to be more useful than math because statistics is basically a toolbox for dealing with uncertainty and that’s the really important part.
Philosophy includes epistemology, which is kind of important to epistemic rationality.
Philosophy is a toolbox as well as a set of doctrines.
Various philosophies include different approaches to epistemology. Which one do you want to teach?
I agree that philosophy can be a toolbox, but so can pretty much any field of human study—from physics to literary criticism. And here we’re talking about teaching rationality, not about the virtues of a comprehensive education.
The Enlightenment? Try Ancient Greece.
I don’t think the Greeks aimed to teach hoi polloi logic and science. They were the province of a select group of philosophers.
(Pedantic upvote for not saying “the hoi polloi”.)
In the usual way: a system of normative morality which focuses on outcomes (as opposed to means) and posits that the moral outcome is the one that maximizes utility (usually understood as happiness providing positive utility and suffering/unhappiness providing negative utility).
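(To make the “maximizes utility” part concrete, here’s a minimal toy sketch in Python. The outcomes and the numeric utilities are made up purely for illustration; a real utilitarian calculus is obviously nowhere near this simple.)

```python
# A toy version of the utilitarian decision rule described above: assign each
# candidate outcome a net utility (happiness minus suffering) and call the
# outcome with the highest net utility the moral one. All outcomes and numbers
# here are hypothetical placeholders.
outcomes = {
    "volunteer for an afternoon": {"happiness": 8, "suffering": 2},
    "stay home and watch TV": {"happiness": 3, "suffering": 0},
    "do nothing": {"happiness": 0, "suffering": 0},
}

def net_utility(effects):
    # Happiness counts as positive utility, suffering as negative utility.
    return effects["happiness"] - effects["suffering"]

# On this toy account, the moral outcome is simply the argmax of net utility.
best = max(outcomes, key=lambda name: net_utility(outcomes[name]))
print(best)  # -> "volunteer for an afternoon" under these made-up numbers
```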