He never finished high school, but he taught himself a bunch of stuff.
Is this really the best second sentence to have? This, plus a few pieces later (like calling LW fringe-y and cult-y before saying it’s mostly about noncontroversial things), suggests you’re optimizing around an objection you imagine the listener has (“isn’t that place Yudkowsky’s cult?”), which plants that thought in their head even if it wasn’t there already.
That is, the basic structure here is something like:
Founders
Broad description of beliefs
Detailed description of beliefs
Problems
Community
I suspect you’re better off with a structure like:
We know a lot more about thinking now than we did in the past, and it seems like thinking about thinking has multiplicative effects. This is especially important today, given how much work is knowledge work.
There’s a cluster of people interested in that who gathered around a clear explanation of the sort of worldview you’d build today as a cognitive psychologist and computer programmer, one you couldn’t have built in the past but that is built on the past. That is, the fruits of many different intellectual traditions have fertilized the roots of this one.
As an example of this, a core concept, “the map is not the territory,” comes from General Semantics by way of Hayakawa. What it means is that we have mental models of external reality that, from the inside, seem to be reality, but are distinct from it, just as Google Maps might look a lot like the surface of the Earth but isn’t. This mental separation gives you a grounded understanding of the relationship between your beliefs and reality, which has lots of useful downstream effects.
But that’s just one out of many concepts; the really cool thing about the rationality community is that when everyone has the same language (and underlying concepts), they can talk much faster about much more interesting things, cutting quickly to the heart of matters and expanding the frontiers of understanding. Lots of dumb arguments just don’t happen, because everyone knows how to avoid them.
I get the impression that a lot of people start off with a feeling that it’s weird and cult-y. For that reason, I feel it’s important to address that and communicate that “actually, rationality is normal”. If you didn’t already find it weird (and wouldn’t have come to find it weird after some initial investigation), my intuition is that such a forewarning wouldn’t lead you to consider it weird, and thus has minimal downside. I feel somewhat confident about that intuition, but not too confident.
This would be an interesting thing to test, though. I look forward to updating my beliefs based on others’ experiences and intuitions here.
“Actually, X” is never a good way to sell anything. Scientists are quite prone to this kind of speech, which from their perspective is fully justified (because they’ve exhaustively studied a certain topic), but what the average person hears is the “you don’t know what you’re talking about” half of the implication, which makes them deaf to the “I do know what I’m talking about” half. If you just place the fruits of rationality on display, anyone with a brain will be able to recognize them for what they are and adjust their judgements accordingly.
Here’s an interesting exercise: find anyone in the business of persuasion (a lawyer, a salesman, a con artist) and see how often you hear them say things like “no, actually...” (or how often you hear them not saying such things).
My impression: a major issue is that other people get the idea that LessWrong consists of a few people preaching their own ideas, when in reality its members mostly advocate ideas that were discovered by, and are widely agreed upon among, academic experts. Just saying “it comes from academics” doesn’t seem to address this issue directly enough.
That said, I see what you mean about “actually, X” being a pattern that may lead people to instinctively argue the other way. So there is a cost, but my impression is that it doesn’t outweigh the benefit of directly addressing a major concern that others have, at least for most audiences; there are certainly some less charitable audiences who need to be approached more gently.
I’d consider my confidence in this to be moderate. Getting your data point has led me to shift downward a bit.
I hate to have to say this, but directly addressing a concern is a form of social confirmation that the concern deserves to be addressed, and thus that it’s based in something real. Imagine a Scientologist offering to explain to you why Scientology isn’t a cult.
Of the people I know who are outright hostile to LW, the hostility mostly stems from basilisks and polyamory and other things that make LW both an easy and a fun target for derision. And we can’t exactly say that those things don’t exist.
I hate to have to say this, but directly addressing a concern is a form of social confirmation that the concern deserves to be addressed, and thus that it’s based in something real.
I could see some people responding that way. But I could see others responding with, “oh, ok, that makes sense”. Or maybe, “hm, I can’t tell whether this is legit; let me look into it further”. There are lots of citations and references in the LessWrong writings, so it’s hard to argue with the fact that it’s heavily based on existing science.
Still, there is the risk of some people just responding with, “Jeez, this guy is getting defensive already. I’m skeptical. This LessWrong stuff is not for me.” I see that directly addressing a concern can signal bad things and cause this reaction, but for whatever reason, my intuition is that this sort of reaction will be in the minority in this context (in other contexts, I could see the pattern being more harmful). I’m starting to feel less confident in that, though. I have to be careful not to Typical Mind here; I know I have an issue with Typical Minding too much, and I need to look out for it.
The good thing is that user research could totally answer this question. Maybe that’d be a good activity for a meet-up group or something. Maybe I’ll give it a go.
If you just place the fruits of rationality on display, anyone with a brain will be able to recognize them for what they are and adjust their judgements accordingly.
Behold LW! :-)
Hm, probably not. It seems to unnecessarily risk giving an even “cult-ier” impression. It also seems worthwhile to be more specific about why I claim that he’s smart. Changed, thanks.