So maybe it’s just me, but my reaction to the fanfic was something like “Eliezer is writing rationalist Harry Potter fanfiction. That’s pretty awesome. And educational!” I check it every so often to see if there’s a new chapter, and I’ve shared it with a couple of people. That’s pretty far from:
Basically, Mr. Yudkowsky’s ubiquitous yes-men like to claim (quietly encouraged by the man himself) that this fic somehow transcends its medium to become some sort of higher work
It seemed like most people had similar reactions to mine, but maybe Less Wrongers have been making a bigger deal out of it elsewhere? We didn’t even have this thread until a couple of chapters ago.
Similarly:
apparently Mr. Yudkowsky had a sizable online following prior to writing this tour de force, and I guess some of it must have overlapped with the Potter fandom, because this fic quickly skyrocketed in popularity and number of reviews
My sense was that we had almost nothing to do with the popularity. It didn’t get linked to from LW until like chapter 12 or so, if I remember correctly. I know a couple people here made image macros but Eliezer’s following isn’t nearly large enough to generate this kind of popularity by itself, right?
As for the rest of it: it reads like my mother critiquing MTV. The blogger doesn’t understand where the author is coming from or who he is writing for, and as a result totally misses the target. For example, the fact that Harry has three last names clearly isn’t Eliezer making a sloppy feminist statement. If anything, he’s laughing at himself and the subculture he’s a part of. I laughed out loud when I read it, because obviously rationalist-Harry would have a compound surname. It’s exactly the right amount of PC vanity for the family of an Oxford professor with a kid too smart for his own good.
My sense was that we had almost nothing to do with the popularity. It didn’t get linked to from LW until like chapter 12 or so, if I remember correctly. I know a couple people here made image macros but Eliezer’s following isn’t nearly large enough to generate this kind of popularity by itself, right?
I was initially referred to it by two people who are not LW readers.
The individual writing the blog may be suffering from a bit of belief overkill (one of my favorite cognitive biases; someone should do a top-level post about it at some point, since many different cognitive biases can be thought of in a belief overkill framework).
Is belief overkill different from confirmation bias (which is what comes up when I google)?
They’re related; some argue that confirmation bias is an example of belief overkill. Belief overkill is basically the tendency to accept all arguments that support one’s opinion, even if only in a peripheral fashion. Thus, for example, people who think that using fetal stem cells for medical purposes is moral are much more likely to believe that stem cells will be medically helpful than people who think that such use is immoral. Essentially, people compile arguments for why X is Good/Bad rather than dividing the questions properly. There are some posts that touch on this issue (such as those discussing why politics is the mind-killer), but I’m not aware of any post discussing it in detail (although given how extensive the archives are, I estimate a high probability that I’ve simply missed the relevant ones).
Sounds a lot like the halo effect.
Policy Debates Should Not Appear One-Sided is related to belief overkill, I’d say.
Motivated cognition would also be a special case of belief overkill: it’s being too ready to develop and accept arguments for what you want to believe. Belief overkill is the same process applied to arguments from both yourself and other people.
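To make the “dividing questions properly” point concrete, here’s a toy simulation (my own sketch, not anything from the thread; the 0.35 “pull” parameter and all the function names are invented for illustration). An unbiased agent judges a logically independent empirical claim on its own evidence, so its verdict is uncorrelated with its moral stance on the core issue, while a belief-overkill agent’s verdicts collapse into a package deal with its stance:

```python
import random

random.seed(0)

def unbiased_verdict(supports_my_side):
    # Dividing questions properly: judge the empirical claim on its own
    # (here, evenly balanced) evidence, ignoring which side it would help.
    return random.random() < 0.5

def overkill_verdict(supports_my_side, pull=0.35):
    # Belief overkill: be more willing to accept a claim merely because
    # accepting it would support your overall position.
    p = 0.5 + (pull if supports_my_side else -pull)
    return random.random() < p

def stance_verdict_agreement(verdict_fn, trials=10_000):
    # How often does the verdict on an independent empirical question
    # (e.g. "stem cells will be medically helpful") line up with the
    # agent's randomly assigned moral stance on the core issue?
    agree = 0
    for _ in range(trials):
        pro = random.random() < 0.5  # coin-flip moral stance
        accept = verdict_fn(supports_my_side=pro)
        agree += (accept == pro)
    return agree / trials

print(f"unbiased agent: {stance_verdict_agreement(unbiased_verdict):.2f}")  # ~0.50
print(f"overkill agent: {stance_verdict_agreement(overkill_verdict):.2f}")  # ~0.85
```

Run it and the unbiased agent’s verdict matches its stance about half the time (pure chance), while the overkill agent’s answer to a logically independent question is mostly predictable from its stance alone, which is exactly the stem-cell pattern described above.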