Is it possible that some of the reported “rationality content” was more like genre-savviness, which is more visible to people who are very familiar with the genre in question?
I really like the Inuit one.
Thank god for the use-mention distinction :-)
I occasionally remember to keep pencil + paper by my bed for this reason, so that I can write such things down in the dark without having to get up or turn on a light. Even if the results aren’t legible in the usual sense, I’ve almost always been able to remember what they were about in the morning.
Eliezer seems to be really really bad at acquiring or maintaining status. I don’t know how aware of this fault he is, since part of the problem is that he consistently communicates as if he’s super high status.
Musk thinks there’s an issue in the 5-7 year timeframe.
Hopefully his enthusiasm (and his financial support) isn’t too dampened when that fails to be vindicated.
I, for one, would look forward more to being Evassarated.
Thank you for articulating this so well :-)
I think the survey is pushed by SJW trolls.
What does this even mean?
I think we use “Best” (which is a more complicated measure than absolute points) rather than “Top” (absolute points) precisely to reduce the effectiveness of that strategy.
For what it’s worth, I perceived the article as more affectionate than offensive when I first read it. That may have something to do with reading the full piece rather than excerpts, so I’d recommend reading the full piece (which isn’t that much longer) first if you care.
In addition to what gwern said, it’s worth bearing in mind that Harper’s is a very literary sort of magazine, and its typical style is thus somewhat less straightforward than most news.
If many people dismiss LW, MIRI, and CFAR for similar reasons, then the only rational response is to identify how that “this is ridiculous” reaction can be prevented.
I agree with your overall point, but I think that “this is ridiculous” is not really the author’s main objection to the LW-sphere. It’s clearer in the context of the whole piece, but they’re essentially setting up LW/MIRI/CFAR as typical of Silicon Valley culture(!), a collection of mad visionaries (in a good way) whose main problem is elitism; Ethereum is then presented as a solution to this problem, or at least as indicative of a better attitude. I don’t necessarily agree with any of this, but that seems to be the thesis of the article.
I have to say I appreciated the first description of LessWrong as “confoundingly scholastic”.
And here it is, as a pdf! (I finally thought of trying to log in as a subscriber)
I have it in hard copy, but all attempts so far to scan or photograph it have been foiled. I’m working on it, though; it’s by far the best media piece on Less Wrong I’ve seen so far.
ETA—To give you an idea: the author personally attended a CFAR workshop and visited MIRI, and upon closer inspection one can make out part of the Map of Bay Area Memespace in one of the otherwise-trite collage illustrations.
Oh, I think we’re using the phrase “political movement” in different senses. I meant something more like “group of people who define themselves as a group in terms of a relatively stable platform of shared political beliefs, which are sufficiently different from the political beliefs of any other group or movement”. Other examples might be libertarianism, anarcho-primitivism, internet social justice, etc.
I guess this is a non-standard usage, so I’m open to recommendations for a better term.
In terms of Death Note, I’ve read the first several volumes and can vouch that it’s a fun, “cerebral” mystery/thriller, especially if you like people being ludicrously competent at each other, having conversations with multiple levels of hidden meaning, etc. Can’t say there’s anything super rational about it, but the aesthetic is certainly there.