If by “spreading rationality” you mean spreading LW material and ideas, then a potential problem is that it causes many people to donate their money to AI friendliness research instead of to malaria nets. Although these people consider this to be “effective altruism”, as an AI skeptic it’s not clear to me that this is significantly more effective than, say, donating money to cancer research (as non-EA people often do).
My goal is convincing people to think in a more clear, rational, evidence-based way, as informed by LW materials. Some people may choose to donate to AI friendliness research, and others to other EA causes; as you can see from the blog post I cited, I specifically highlight the benefits of the EA movement. Regardless, as Brian Tomasik points out, helping people be more rational contributes to improving the world, and thus to the ultimate goal of the EA movement.
I agree that increasing rationality would improve the world, but would it improve the world more than other efforts? I believe you will face stiff competition from MIRI for effective altruists’ charitable donations. From the Brian Tomasik essay you referenced:
“…because AI is likely to control the future of Earth’s light cone absent a catastrophe before then, ultimately all other applications matter through their influence on AI.”
Separately:
“Is encouraging philosophical reflection in general plausibly competitive with more direct work to explore the philosophical consequences of AI? My guess is that direct work like MIRI’s is more important per dollar.”
Why should I support Intentional Insights instead of MIRI? I’m sure I won’t be the only potential donor to ask this question, so I recommend that you craft a solid response.
Excellent, thank you for the feedback on what to craft! I will think about this further, and appreciate the ideas!
Is there an objective measure by which LW materials inform more “clear and rational” thought? Can you define “clear and rational”? Or actually, to use LW terminology, can you taboo “clear” and “rational” and restate your point?
But does helping people be more rational contribute to improving the world in an effective way?
Well, I’d say that “clear and rational” is the same as “arriving at the correct answer to make the best decision to refine and achieve goals.” So yes, I would say it does contribute to improving the world in an effective way: helping people both understand their goals better (refine goals) and then achieve those goals gives them better lives, and thus improves flourishing.
Do you have any evidence that LW materials help people refine and achieve their goals?
Helping people refine and achieve their goals is pretty damn difficult: school boards, psychiatrists, and welfare programs have been trying to do this for decades. For example, are you saying that teaching LW material in schools will improve student outcomes? I would bet very strongly against such a prediction.
There’s actually quite a bit of evidence that helping students refine and achieve their goals helps them learn better; for example, here.
There’s also quite a bit of reason to be skeptical of that evidence. Here’s slatestarcodex’s take: http://slatestarcodex.com/2015/03/11/too-good-to-be-true/
Yup, I’m aware of Scott’s dislike of the growth mindset hypothesis; he’s at the extreme end of the spectrum on that one. However, even in the post itself, he notes that there are several studies showing the benefits of teaching students to be goal oriented. There’s also a lot of research showing that teaching students metacognition is helpful; for example, this chapter cites a lot of studies. I’d say that, overall, the probabilistic evidence supports the hypothesis that teaching people to be goal oriented and self-reflective about how they pursue their goals will help them get better results in achieving those goals.
Okay, let’s suppose for a second that I buy that teaching students to be goal oriented helps them significantly. That still leaves quite a few questions:
Many school boards already try to teach students to be goal oriented. Certainly “list out realistic goals” was said to me countless times in my own schooling. What do you plan to do differently?
There seems to be no evidence at all that LW material is better for life outcomes than any other self-help program, and some evidence that it’s worse. Consider this post (again by Scott): http://lesswrong.com/lw/9p/extreme_rationality_its_not_that_great/
I plan to actually teach students how to be goal oriented. It’s the difference between telling people “lose weight” and giving them clear, specific instructions for how to do so. Here is an example of how I do this in a videotaped workshop.
I would like to take an experimental attitude toward LW content, and I look forward to seeing the results of my experiments. I don’t intend to do the extreme rationality stuff, or to expect more of it than it can deliver. We’ll see, I guess :-)