The third horn of the anthropic trilemma is to deny that there is any meaningful sense whatsoever in which you can anticipate being yourself in five seconds, rather than Britney Spears; to deny that selfishness is coherently possible; to assert that you can hurl yourself off a cliff without fear, because whoever hits the ground will be another person not particularly connected to you by any such ridiculous thing as a “thread of subjective experience”.
A question of rationality. Eliezer, I have talked to a few Less Wrongers about which horn they take on the anthropic trilemma, sometimes letting them know beforehand what my position was, sometimes giving no hint as to my predispositions. To a greater or lesser degree, the following people have all endorsed taking the third horn of the trilemma (and also see the part that goes from ‘to deny selfishness as coherently possible’ to the end of the bullet point as a non sequitur): Steve Rayhawk, Zack M. Davis, Marcello Herreshoff, and Justin Shovelain. I believe I’ve forgotten a few more, but I know that none endorsed any horn but the third.

I don’t want to argue for taking the third horn, but I do want to ask: to what extent does knowing that these people take the third horn cause you to update your expected probability of taking it yourself, once you come to understand the matter more thoroughly? A few concepts that come to mind are ‘groupthink’, majoritarianism, and conservation of expected evidence. I’m not sure there is a ‘politically correct’ answer to this question. I also suspect (perhaps wrongly) that you too favor the third horn but would rather withhold judgment until you understand the issue better, in which case your expected probability would probably not change much.
[Added metaness: I would like to make it very especially clear that I am asking a question, not putting forth an argument.]
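For concreteness, here is a minimal sketch of what ‘conservation of expected evidence’ requires in this kind of situation: the prior must equal the expectation of the posterior, so you cannot expect to be moved in a particular direction by evidence you have not yet evaluated. All numbers below are invented purely for illustration; they are not anyone’s actual credences.

    # Conservation of expected evidence: the prior equals the
    # expectation of the posterior over possible observations.
    # All numbers are made up for illustration only.

    prior = 0.5                 # prior credence that the third horn is right
    p_endorse_if_true = 0.8     # chance these people endorse it if it is right
    p_endorse_if_false = 0.3    # chance they endorse it anyway if it is wrong

    p_endorse = p_endorse_if_true * prior + p_endorse_if_false * (1 - prior)

    posterior_given_endorse = p_endorse_if_true * prior / p_endorse
    posterior_given_no_endorse = (1 - p_endorse_if_true) * prior / (1 - p_endorse)

    expected_posterior = (posterior_given_endorse * p_endorse
                          + posterior_given_no_endorse * (1 - p_endorse))

    print(expected_posterior)   # 0.5, equal to the prior (up to rounding)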
The fourth horn of the anthropic trilemma is to deny that increasing the number of physical copies increases the weight of an experience, which leads into Boltzmann brain problems, and may not help much (because alternatively designed brains may be able to diverge and then converge as different experiences have their details forgotten).
From EY’s post:
Suppose I build a (conscious) brain in hardware using today’s technology. It uses a very low current density, to avoid electromigration.
Suppose I build two of them, and we agree that both of them experience consciousness.
Then I learn a technique for treating the wafers to minimize electromigration. I create a new copy of the brain, the same as the first copy, only using twice the current, and hence being implemented by a flow of twice as many electrons.
As far as the circuits and the electrons travelling them are concerned, running it is very much like running the original 2 brains physically right next to each other in space.
So, does the new high-current brain have twice as much conscious experience?
I’m not as versed in this trilemma as I’d like to be, so I’m not sure whether that final question is rhetorical or not, though I suspect that it is. So mostly for my own benefit:
While there’s no denying that subjective experience is ‘a thing’, I see no reason to expect that abstraction to obey rules like multiplication. The aeroplane exists a number of levels of abstraction above the atoms it’s composed of, but we still find it a useful abstraction. The ‘subjective experiencer’ is many, many levels higher again, which is why we find it so difficult to talk about. Twice as many atoms doesn’t make twice as much aeroplane; the very concept is nonsense. Why would we think any differently about the conscious self?
My response to the ‘trilemma’ is as it was when I first read the post: any sensible answer isn’t going to look like any of those three; it’s going to require rewinding past the ‘subjective experience’ concept and doing some serious reduction work. ‘Is there twice as much experience?’ and ‘Are you the same person?’ just smell like such wrong questions to me. Anyone else?
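As a small aside on the quoted hardware scenario: the step from ‘twice the current’ to ‘a flow of twice as many electrons’ is just the definition of current as charge per unit time. A rough sketch, with purely illustrative current values:

    # Current is charge per unit time, so the number of electrons passing
    # per second scales linearly with current. Currents below are illustrative.

    ELECTRON_CHARGE = 1.602e-19  # coulombs per electron

    def electrons_per_second(current_amps):
        """Electrons flowing past a point per second at a given current."""
        return current_amps / ELECTRON_CHARGE

    low_current_brain = electrons_per_second(0.001)   # original low-current design
    high_current_brain = electrons_per_second(0.002)  # treated wafer, twice the current

    print(high_current_brain / low_current_brain)     # 2.0: same computation, twice the electron flow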
Nick Bostrom’s “Quantity of Experience” discusses similar issues. His model would, I think, answer “no”, since the structure of counterfactual dependences is unchanged.
Nick, will have a look at that Bostrom piece, cheers.
http://lesswrong.com/lw/19d/the_anthropic_trilemma/