You want to teach me how to win? Show me a million bucks in your bank account.
I guess CFAR should let Peter Thiel teach in their workshops. Or, more seriously, use his name (assuming he agrees with this) when promoting their workshops.
When I think about this more, there is a deeper objection. Something like this: “So, you believe you are super rational and super winning. And you are talking to me, although I don’t believe you, so you are wasting your time. Is that really the best use of your time? Why don’t you do something… I don’t know exactly what… but something a thousand times more useful right now, instead of talking to me? Because this kind of optimization is precisely the thing you claim to be good at; obviously you’re not.”
And this is an objection that makes sense. I mean, it’s like if someone is trying to convince me that if I invest my money in his plan, my money will double… it’s not that I don’t believe in the possibility of doubling the money; it’s more like: why doesn’t this guy double his own money instead? -- Analogously, if you have superpowers that allow you to win, then why the heck are you not winning right now instead of talking to me?
EDIT: I guess we should reflect on our actions when we are trying to convince someone else of the usefulness of rationality. I mean, if someone resists the idea of LW-style rationality, is it rational (is it winning on average) to spend my time trying to convince them, or should I just say “next” and move on to another person? I mean, there are seven billion people on this planet, half a million in the city where I live, so if one person does not like this idea, it’s not like my efforts are doomed… but I may doom them by wasting all my energy and time trying to convince this specific person. Some people just aren’t interested, and that’s it.
Yep, that’s a valid and serious objection, especially in the utilitarian context.
A couple of ways to try to deal with it: (a) point out the difference between instrumentality and goals; (b) point out that rationality is not binary but a spectrum, so it’s not a choice between winning everything and winning nothing.
You can probably also reformulate the whole issue as “helping you deal with life’s problems—let me show you how you can go about it without too much aggravation and hassle”...