TGGP: A more convincing counterexample to this pattern would be Hermes.
Daniel_Franke
Hmm, I just noticed that there’s a slight contradiction here:
“I know. Believe me, I know. Only youth can Administrate. That is the pact of immortality.”
Then how is it possible for there to be such a person as a Lord Administrator, if the title takes 100 years to obtain? While a civilization of immortals would obviously redefine their concept of youth, it seems like a stretch to call a centenarian young if 500 is still considered mind-bogglingly old.
Kevin, seconded. I’m half-expecting Eliezer to copy-paste a few paragraphs from the climax of Foundation’s Edge into the middle of the story in order to see if anyone notices :-)
How is it that these aliens’ anatomy is so radically different from humans’, yet they have a word for “kick”?
The URL for the site ought to be community.overcomingbias.com. That doesn’t have to be its title; you can still title it “Less Wrong”.
Economic weirdtopia: being rich is socially unacceptable; not because the society values equality, but because it’s considered decadent and, in a certain sense, cheating. Weirdtopia’s system of morality is virtue-based, and one of their highest virtues is a peculiar sort of self-sufficiency. Essentially, you’re expected to be able to make yourself safe and comfortable by relying only on your wits and not on material goods. Needing to consume natural resources is accepted as a fact of life, but you should be able to do as much as possible with as little as possible. There is no concept of land ownership. In a loose sense of the word “own”, you own the chattels that you produce with your own hands, but accepting the products of others’ labor is a vice.
Exchanging knowledge and techniques is normal and acceptable. Being knowledgeable about things that others have discovered is entirely amoral. Innovating earns you respect, but equally so regardless of whether you’re the first to ever discover something or whether you figured out something widely-known on your own.
Silas, what do you mean by a subjective feeling of discontinuity, and why is it an ethical requirement? I have a subjective feeling of discontinuity when I wake up each morning, but I don’t think that means anything terrible has happened to me.
EY, I’m not following your comment about CI versus Alcor. What do you see as the benefits of choosing Alcor, and what does your age have to do with choosing to forego them?
I second Roland’s suggestion of a two-tier system.
Yvain, I too am surprised to be told that there are many people who aren’t at stage 2. It’s not a bit surprising that most people aren’t at stage 3. I’ve been capable of that kind of thought for as long as I can remember, but it’s only since I was maybe 17 (I’m currently 23) that I’ve actually had a habit of thinking that way.
I’m amused by the number of people on this thread saying that they’ve acquired thought habit X through overcoming a mental disorder.
mtraeven: Why are hardcore materialists, who presumably have no truck with Cartesian mind/body dualism, so eager to embrace brain/body dualism? Or software/hardware dualism?
Has anyone but me brought up software/hardware dualism? I’m only using it metaphorically. I’m not claiming any fundamental, bright-line distinction.
EY: You find out how to disable pieces of yourself. Then one day you find that you’ve disabled too much. It doesn’t necessarily have anything to do with religion or even with beliefs, except for whatever beliefs spurred you to start deleting pieces of yourself.
Okay, I now see what you’re saying. I haven’t experienced it. I understand the trick of disabling pieces of oneself, but I’ve never in my recollection abused it. However, I can understand what it would be like because I’ve experienced something that I’m guessing is similar: I’m a high-functioning autistic, and I’ve had to put considerable effort into software emulation of the emotional hardware that I’m missing.
Sorry, I botched the second-to-last sentence. It should read “For example, if you were a cult member between 2004 and 2006, and you_2008 considers both you_2005 and you_2002 to be fools, you_2005 considers you_2002 a fool, but you_2008 considers you_2002 wiser than you_2005, then count that as one improvement rather than two.”
EY: I remember my own recovery as being more like a chain of “Undos” to restore the original state.
Presuming you’re referring to your religious upbringing, that seems like a funny way of characterizing it. Virtually every old primitive civilization that we know about had religious superstitions that all look pretty similar to each other and whose differences are mostly predictable given the civilization’s history and geography. (I say “virtually” just as cover; I don’t actually know of any exceptions). Modern Judaism is a whole lot saner than any of these, and even somewhat saner than most modern mainstream alternatives. So it seems to me that your parents did pretty well: what they taught you was far from ideal, but it’s a lot better than what you would likely have come up with on your own if you had been raised by wolves. Rejecting religion is development, not rehabilitation, because religion isn’t active stupidity, merely the rest state of an ignorant mind.
Cassandra: One of my first memories is relentlessly purging my early childhood personality shortly after I discovered how to perform the trick, then panicking and rebuilding a new self out of any sort of stuff lying around. I still think that rampant self-modification left scars on my mind that are there today.
I’m with EY in finding this unusual. Since reaching the point of having a physically developed brain, I’ve been through five cataclysmic adjustments to my worldview, each spurred by the influence of a particular writer (Ayn Rand, Eric Raymond, Paul Graham, Murray Rothbard, and Richard Dawkins in that sequence, with EY currently vying to be #6). But these have always been exciting, not frightening or traumatic even at the most reptilian level. When I come across a writer with a surprising philosophy that I’m intrigued by but decide to reject, I’m disappointed, not relieved. I can’t relate to it feeling otherwise.
N.b., discarding religion was not a cataclysm. It was pretty gradual. I had been labeling myself a Pascal’s Wager agnostic since before my Bar Mitzvah, and by the time I came across Dawkins I was already an atheist. Dawkins merely brought me out of the closet, getting me to take pride in being an atheist and to denounce superstition rather than just passively reject it.
I’d like to propose a “personal development score” of sorts. Most adults consider their teenage self a fool. How many times over have you gotten to this point? That is, think of the most recent revision of you which your current self considers a fool. Then recurse back to that point and determine what you would have answered then. How many times can you recurse before you reach back to childhood? Deduct obvious regressions. For example, if you were a cult member between 2004 and 2006, and you_2008 considers both you_2005 and you_2002 to be fools, you_2005 considers you_2002 a fool, but you_2008 considers you_2005 wiser than you_2002, then count that as one improvement rather than two. Then divide by (your age − 13).
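The recursion above can be sketched as code. This is a minimal illustration under my own assumptions; the names `chain`, `wiser`, and `development_score` are hypothetical, invented here for the sketch, and the comment itself proposes no implementation:

```python
# Toy formalization of the "personal development score" rule.
def development_score(chain, wiser, age):
    """Score personal development as described in the comment.

    chain: past revisions of yourself, newest first, where each entry
           regards the next as its most recent foolish revision,
           recursing back to childhood (e.g. [2008, 2005, 2002]).
    wiser(a, b): True if your *current* self judges you_a wiser than
           you_b; False marks a regression, which is deducted.
    age:   your current age; the raw count is divided by (age - 13).
    """
    improvements = 0
    # Walk consecutive pairs of revisions, newest to oldest.
    for newer, older in zip(chain, chain[1:]):
        if wiser(newer, older):
            improvements += 1  # a genuine step forward
        # otherwise: a regression (e.g. joining a cult); not counted
    return improvements / (age - 13)
```

With the cult example (chain [2008, 2005, 2002], where the current self rates you_2002 above you_2005), only the 2008-over-2005 step counts, so a 23-year-old would score 1 / (23 − 13) = 0.1.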
...: I don’t think the implementation of a Friendly AI is any harder than the specification of what constitutes Friendly AI plus the implementation of an unFriendly AGI capable of implementing the specification.
As for the idea of competing AIs, if they can modify each other’s code, what’s to keep one from just deleting the other?
Actually, chess players do care about the metric you stated. It’s a good proxy for the current usefulness of your bishops.
Emperor Claudius I is the best candidate I can think of for a good ruler who took power by dubious means.
You haven’t earned the right to say X.
I think that one is poorly-phrased but defensible. You can think of it as shorthand for “Your life experiences have provided you with an insufficient collection of Bayesian priors to permit you to assert X with any reasonable certainty”.
Would you still disagree with that one if “the devil” was replaced by “a strong AI”?
Yes. Suffice it to say I don’t think I’d be a very reliable gatekeeper :-).
(Conversely, I don’t think the AI’s job in the box experiment is even hard, much less impossible. Last week, I posted a $15 offer to play the AI in a run of the experiment, but my post disappeared somehow.)
A step in the right direction: http://money.cnn.com/2009/02/25/news/companies/banks_test/index.htm?postversion=2009022514