ph’nglui mglw’nafh Eliezer Yudkowsky Clinton Township wgah’nagl fhtagn
Doesn’t really roll off the tongue, does it.
Considering the ridiculous context of the rest of the conversation (i.e., Dumbledore either pretending to be insane or actually letting some real insanity slip through), is it too far outside the realm of possibility for that comment to be a joke? It seemed like Dumbledore was going out of his way to screw with Harry in this chapter. Even if the machine actually does what he said it does, I could easily see the comment about “how much work it took to nail that down” being a joke Dumbledore told for his own amusement, knowing that Harry was too young to “get it”.
I had to look it up, but I definitely agree, especially considering how quickly the karma changes reversed after I edited in that footnote.
I wish I could upvote this post back into the positive.
(It seems pretty obvious to me that this is a direct satire of the previous post by a similar username. What, no love for sarcasm?)
Such a great game. Seeing this makes me want to play it again, having discovered this site and done some actual reading on transhumanism and AI. It might change the choice I’d make at the end...
Of course, this goes even further than just proving the old saying about Deus Ex, considering you never even mentioned the title!
I know this is a serious necro-post, but I felt compelled.
Now that I think about it, “natural selection” seems more appropriate.
Exactly. I also suspect that logical overconfidence, i.e., knowing a little bit about bias and thinking it no longer affects you, is magnified by higher intelligence.
I can’t help but remember that saying about great power and great responsibility.
Hello, Less Wrong.
Like some others, I eventually found this site after being directed by fellow nerds to HPMOR. I’ve been working haphazardly through the Sequences (getting neck-deep in cognitive science and philosophy before even getting past the preliminaries for quantum physics, and loving every bit of it).
I can’t point to a clear “aha!” moment when I decided to pursue the LW definition of rationality. For as long as I can remember I’ve been highly intelligent and interested in Science, but it’s hard for me to model how my brain actually processed information that long ago. Before high school (at the earliest), I was probably just as irrational as everyone else, only with bigger guns.
Sometime during college (B.S. in mechanical engineering), I can recall beginning an active effort to consider as many sides of an issue as possible. This was motivated less by a quest for scientific truth and more by a tendency to get into political discussions. Having been raised by parents who were fairly traditional American conservatives, I quickly found myself becoming some kind of libertarian. This seems to be a common occurrence, judging both by the welcome comments I’ve read here and by what I’ve seen elsewhere. I can’t say at this point how much of this change was the result of rational deliberation and how much was from mere social pressure, but on later review it still seems like a good idea regardless.
The first time I can recall actually thinking “I need to improve the way I think” was fairly recent, in graduate school. The primary motivation was still political. I wanted to make sure my beliefs were reasonable, and the first step seemed to be making sure they were self-consistent. Unfortunately, I still didn’t know the first thing about cognitive biases (aside from running head-on into confirmation bias on a regular basis without knowing the name). Concluding that the problem was intractable, I withdrew from all friendly political discussion except one in which my position seemed particularly well-supported and therefore easy to argue rationally. I never cared much for arguing in the first place, so if I’m going to do it I’d prefer to at least have the data on my side.
I’ve since lost even more interest in trying to figure out politics, and decided while reading this site that it would be more immediately important to figure myself out instead. I’ve yet to identify that noble cause to fight for (although I have been interested in manned space exploration enough to get two engineering degrees), but I think a more rational me will be more effective at whatever that cause turns out to be.
Still reading and updating...
(I’m neither a theology scholar nor an anthropologist, so I may lack some important background on this.)
I agree that the idea of early church leaders isolating members in order to explicitly limit the introduction of new ideas sounds far-fetched. It strikes me as the kind of thing that would only be said after the fact, by a historian looking for meaning in the details. But attributing those member-isolating rules to something like “preserving group identity” seems like the same kind of after-the-fact story.
I find myself wondering if something like the anthropic principle is at work here, i.e., the only religious groups to survive that long are the ones that historically isolated their members from outside ideas. There’s probably a more general term for what I’m getting at (survivorship bias, maybe?).
When life gives you lemons, order miracle berries.