Trying to teach someone to think rationally is a long process—maybe even impossible for some people.
This is not incompatible with marketing per se; marketing is about advocacy, not teaching. And pretty much all effective advocacy has to target System 1, the “heart” or the “gut”, in fairly direct terms.
To me, it seems that CFAR was supposed to be working on this sort of stuff, and they have not accomplished all that much. So I think, in a way, we should welcome the fact that Gleb T./Intentional Insights are now trying to fill this void. Maybe they aren’t doing it very well at this time, but that’s a separate matter.
To me, it seems that CFAR was supposed to be working on this sort of stuff, and they have not accomplished all that much.
CFAR’s mission wasn’t marketing but actually teaching people to be more rational, in a way that genuinely helps them. As far as I understand, they are making progress on that goal, and the workshops they run now are better than the ones at the beginning.
As rationalists, we have a responsibility to give advice that is actually useful. CFAR’s approach of first figuring out what the useful advice is, instead of first focusing on marketing, is good.
CFAR’s approach of first figuring out what the useful advice is, instead of first focusing on marketing, is good.
Sounds like a false dilemma. How about splitting CFAR into two groups? The first group would keep inventing better and better advice (more or less what CFAR is doing now). The second group would take the current results of their research and try to deliver it to as many people as possible. The second group would also do the marketing. (Actually, the whole current CFAR could continue to be the first group; the only necessary thing would be to cooperate with the second one.)
You should multiply the benefit from the advice by the number of people that will receive the advice.
Yeah, it’s not really a linear function. Making one person so super rational that they would build a Friendly AI and save the world may be more useful than teaching thousands of people how to organize their study time better.
But I still suspect that the CFAR approach is to a large degree influenced by “how we expect people in academia to behave”.
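A toy calculation may make the shape of this disagreement clearer. The sketch below is purely illustrative; every number in it is invented, and it only contrasts the linear “benefit times reach” model with the case where one person’s gain dominates.

```python
# Toy model of the reach-vs-depth tradeoff. All numbers are invented
# for illustration; nothing here is measured data.

# Linear model: total impact = people reached * average benefit per person.
mass_reach = 100_000       # hypothetical readers of broad articles
mass_benefit = 0.001       # small nudge per reader (arbitrary units)

workshop_reach = 100       # hypothetical workshop attendees
workshop_benefit = 5.0     # large, durable change per attendee (arbitrary units)

print("mass outreach:", mass_reach * mass_benefit)         # ~100.0
print("workshops:   ", workshop_reach * workshop_benefit)  # ~500.0

# The linear model breaks down if per-person benefit is heavy-tailed:
# if a single attendee's gain is worth, say, 10,000 units (the
# "one person builds a Friendly AI" case), that one term dominates the
# whole sum, and multiplying reach by an average benefit is misleading.
```

Under these made-up numbers the workshops win, but flipping the assumptions flips the conclusion; the point is only that the answer hinges on the distribution of per-person benefit, not on reach alone.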
I actually spoke to Anna Salamon about this, and she shared that CFAR started by trying a broad outreach approach, and found it was not something they could make work. That’s when they decided to focus on workshops targeting a select group of social elites who would be able to afford their high-quality, high-priced workshops.
And I really appreciate what CFAR is doing; I’m a monthly donor. I think their targeting of founders, hackers, and other techy social elites is great! They can really improve the world through doing so. I also like their summer camps for super-smart kids and their training for Effective Altruists.
However, CFAR is not set up to do mass marketing, as you rightly point out. That’s part of the reason we set up Intentional Insights in the first place. Anna said she looks forward to learning from what we figure out and collaborating together. We’re also working with ClearerThinking, as I described in my comment here.
teaching thousands of people how to organize their study time better.
Given the amount of akrasia in this community, I’m not sure we are at a point where we have a good basis for lecturing other people about this.
Take the current urge propagation exercise: a lot of people who were taught it in person and who have the CFAR texts still can’t do it successfully. Iterating on it until it reaches a form that people can take and use would be good.
But I still suspect that the CFAR approach is to a large degree influenced by “how we expect people in academia to behave”.
From my understanding, CFAR doesn’t want to convince academia directly and isn’t currently planning to run any trials themselves that they will publish.
Actually, the whole current CFAR could continue to be the first group; the only necessary thing would be to cooperate with the second one.
I would appreciate it if CFAR published their theories publicly in writing sooner, but I hope they will publish within the next year. I don’t have access to the CFAR mailing list, and I understand that they do get feedback on their writing via the mailing list at the moment.
The first group would keep inventing better and better advice (more or less what CFAR is doing now).
CFAR very recently renamed implementation intentions to Trigger Action Plans (TAPs). If we had already marketed “implementation intentions” widely as vocabulary, it would be harder to change the term now.
You should multiply the benefit from the advice by the number of people that will receive the advice.
Landmark reaches quite a lot of people, and most of their core ideas aren’t written down in short articles. Scientology is another organisation that tries to do most of its idea communication in person; it still reached a lot of people.
When I was doing Quantified Self community building in Germany, the people who came to our meetups mostly didn’t come because of mainstream media but through other sources. It got to the point where another person told me that giving media interviews is just for fun, not community building.
CFAR’s mission wasn’t marketing but actually teaching people to be more rational
And this makes a lot of sense, if you assume that advocacy is completely irrelevant to teaching people to be more rational. (‘Marketing’ being just another word for advocacy.) But what if both are necessary and helpful? Then it makes sense for someone to work on marketing rationality: either CFAR itself, Intentional Insights, or someone else entirely.
CFAR certainly needs to do some form of marketing to get people to come to its workshops. As far as I understand, CFAR succeeds at doing enough marketing to be able to sell its workshops.
As we as a community get better at actually having techniques that make people more rational, we can step up the marketing.
Well... cracked/buzzfeed-style articles vs. niche workshops. I wonder which strategy has a broader impact?
I don’t think the cracked/buzzfeed-style articles are going to change much about the day-to-day decision-making of the people who read them. CFAR workshops, on the other hand, do.
But that isn’t even the whole story. CFAR manages to learn what works and what doesn’t through their approach. It allows them to gather empirical data that will, in the future, also be transmittable through a medium that’s less intensive than a workshop.
This rationalist community isn’t like New Atheism, where the members know the truth, the problem is mainly that outsiders don’t know it, and the task is to bring the truth to them.
Actually, you’d be surprised at the kind of impact Lifehacker-type articles can have. Dust specks, in sufficiently large numbers, are impactful, after all. And the Lifehacker articles reach many thousands of readers.
Moreover, it’s a question of continuous engagement. Are people engaging with this content, rather than just passing on to another Lifehack article? We have evidence that they are, as described in my comment here.
Actually, you’d be surprised at the kind of impact Lifehacker-type articles can have. Dust specks, in sufficiently large numbers, are impactful, after all.
Dust specks don’t cost a person time to consume. They also have no opportunity cost. Your articles, on the other hand, might have an opportunity cost. Furthermore, it’s not clear that the articles have a positive effect.
As an aside, the articles do have SEO advantages through their links, which are worth appreciating even if the people who read them aren’t affected.
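To make the opportunity-cost point concrete, here is a similarly hedged sketch; again, every quantity is hypothetical. Unlike a dust speck, an article consumes reading time, so the net effect per reader can be negative.

```python
# Toy model of the opportunity-cost objection. All numbers are invented
# for illustration; nothing here is measured data.

readers = 50_000             # hypothetical article reach
benefit_per_reader = 0.002   # value delivered per reader (arbitrary units)
minutes_per_read = 4         # hypothetical reading time
value_per_minute = 0.001     # value of a reader's time (arbitrary units)

cost_per_reader = minutes_per_read * value_per_minute  # ~0.004
net_per_reader = benefit_per_reader - cost_per_reader  # ~-0.002

print("net per reader:", net_per_reader)            # ~-0.002
print("net total:     ", readers * net_per_reader)  # ~-100.0

# A dust speck has zero time cost, so its aggregate effect is at worst zero.
# An article gets no such guarantee: its sign depends on whether the benefit
# per reader exceeds the value of the time it consumes.
```

Under these invented numbers the net effect is negative; with a higher benefit per reader it flips positive. The sketch only shows that the sign is an empirical question, which is the objection being made above.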
We have evidence that they are, as described in my comment here.
I don’t see evidence that you succeed in getting people to another site via your articles. Do you have numbers?
I describe the numbers in my comment here, for the only website where I have access to the backend.
Compare the article I put out on Lifehack to other articles there. Do you think my article has a better return on investment than a typical Lifehack article?
My take is that the goal is to give more useful advice than what people are currently getting. For example, the science-based advice on relationships I give in this article is more useful than the simple experience-based advice that the vast majority of self-improvement articles offer. If it is better, then why not do it?
Remember, the primary goal of Intentional Insights is not to market rationality per se, but to raise the sanity waterline as such. Promoting rationality comes only after people’s “level of sanity” has been raised sufficiently to engage with things like Less Wrong and CFAR. More in my comment on this topic.
Thanks for the support! We are trying to fill a pretty big void.
However, just to clarify, marketing is only one aspect of what we are doing. We have a much broader agenda, which I describe in my comment here. And I’m always looking for ideas on how to do things better!
To me, it seems that CFAR was supposed to be working on this sort of stuff, and they have not accomplished all that much. So I think, in a way, we should welcome the fact that Gleb T./Intentional Insights are now trying to fill this void.
This is not at all obvious to me. If someone tries something and botches it, then when someone else goes to try that thing, they may hear “wait, didn’t that fail the last time around?”
This seems like a fully general counterargument against trying uncertain things...
Agreed that it’s a fully general counterargument. I endorse the underlying point, though, of “evaluate second-order effects of success and failure as well as first-order effects,” and whether or not that point carries the day will depend on the numbers involved.
I’d be curious to hear why you think Intentional Insights is botching it, if you think that is the case; it’s not clear from your comment.
However, I disagree with the premise that someone botching something means other people won’t do it. If that were the case, we would never have had airplanes, for example. If anything, people will be more likely to try to do something better, because they can see it has been done before and know what kinds of mistakes were made.