This is commentary I started making as I was reading the first quote. I think some bits of the post are a bit vague or confusing, but I think I get what you mean by anthropic measure, so that’s okay in service of conveying it. I don’t think equating anthropic measure to mass makes sense, though; counterexamples seem trivial.
> The two instances can make the decision together on equal footing, taking on exactly the same amount of risk, each- having memories of being on the right side of the mirror many times before, and no memories of being on the wrong- tacitly feeling that they will go on to live a long and happy life.
Feels a bit like quantum suicide.
note: having no memories of being on the wrong side does not make this any more pleasant an experience to go through, nor does it provide any reassurance against being the replica (presuming that’s the one which is killed).
> As is custom, the loser speaks first.
Naming the characters Paper and Scissors is a neat idea.
> Paper wonders, what does it feel like to be… more? If there were two of you, rather than just one, wouldn’t that mean something? What if there were another, but it were different?… so that-
>
> [...]
>
> Scissors: “What would it feel like? To be… More?… What if there were two of you, and one of me? Would you know?”
Isn’t Paper thinking in the second person but Scissors in the first? So Paper is thinking about two people but Scissors about three?
> It was true. The build that plays host to the replica (provisionally named “Wisp-Complete”), unlike the original’s own build, is effectively three brains interleaved
Wait, does this now mean there are four people? Three in the replica and one in the original?
> Each instance has now realised that the replica- its brain being physically more massive- has a higher expected anthropic measure than the original.
Um okay, wouldn’t they have maybe thought about this after 15 years of training and decades of practice in the field?
> It is no longer rational for a selfish agent in the position of either Paper nor Scissors to consent to the execution of the replica, because it is more likely than not, from either agent’s perspective, that they are the replica.
I’m not sure this follows in our universe (presuming it is rational when the ratio is 1:1 instead of 3:1 or whatever). I think it might take different rules of rationality or epistemology or something.
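To make the claimed shift explicit (my reconstruction, not necessarily the author’s): write $p$ for an instance’s credence that it is the replica. A selfish agent weighing consent faces

$$\mathbb{E}[U(\text{consent})] = (1-p)\,u_{\text{live}} + p\,u_{\text{die}},$$

which strictly decreases as $p$ rises, so whatever made consenting acceptable at the symmetric $p = 1/2$ can fail once the replica’s extra measure pushes $p$ above it. My doubt is about whether $p$ should track measure at all, not about this step.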
> Our consenters have had many many decades to come to terms with these sorts of situations.
Why are Paper and Scissors so hesitant then?
> That gives any randomly selected agent that has observed that it is in the mirror chamber a 3⁄4 majority probability of being the replica, rather than being the original.
I don’t think we’ve sufficiently established that the “3 minds 1 brain” build actually hosts 3 minds. I don’t think it qualifies for that yet.
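For reference, the arithmetic behind the story’s 3⁄4 figure, which is exactly what this objection targets: if the replica’s build hosts 3 minds and the original’s hosts 1, then a uniformly sampled mind in the chamber gets

$$P(\text{replica}) = \frac{3}{3+1} = \frac{3}{4},$$

whereas if the interleaved build only counts as one mind, this collapses back to $1/2$.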
> But aren’t our consenters perfectly willing to take on a hefty risk of death in service of progress? No. Most Consenters aren’t. Selling one’s mind and right to life in exchange for capital would be illegal.
Why would it be a hefty risk? Isn’t it a 0% chance of death? (The replica is always the one killed.)
> In a normal mirror chamber setup, when the original enters the mirror chamber, they are confident that it is the original who will walk out again. They are taking on no personal risk. None is expected, and none is required.
Okay we might be getting some answers soon.
> The obvious ways of defecting from protocol- an abdication of the responsibility of the consenter, a refusal to self-murder, an attempt to replicate without a replication license- are taken as nothing less than Carcony.
Holy shit this society is dystopic.
> It would be punished with the deaths of both copies and any ancestors of less than 10 years of divergence or equivalent.
O.O
> But if, somehow, the original were killed? What if neither instance of the Consenter signed for their replica’s execution, and the replica were left alive. That would not be considered Carcony. It would not even be considered theft- because a brain always belongs to its mind.
I’m definitely unclear on the process for deciding; wouldn’t only one guillotine be set up, with both parties affixed in place? (Moreover, why wouldn’t the replica just be a brain and not in a body? Then there’s no guillotine, and it’s just fed visual inputs along with the mirror-simulation of the actual room. Sounds feasible.)
> What if neither instance of the Consenter signed for their replica’s execution
Wouldn’t this be an abdication of responsibility as mentioned in the prev paragraph?
> So, do you see now? Do you see how Consenter Nai Paper-Chell-Glass-Stratton was motivated by a simple alignment of the payoff matrices?
Presumably to run away with other-nai-x3-in-a-jar-stratton?
> Paper: “You wouldn’t do that to me. Look… if you’re the original… And I do myself, and I’m the replica. I wont actually be dead, because if you destroy a representation of your brain case in a simulation that doesn’t actually destroy you in real life. I might not even desync. Whatever happens, I’ll still be alive after the penetration so I’ll know I’m the replica, but you might not know. It might look like I’m really dead. And you’ll have no incentive to follow through and do yourself at that point.”
> Scissors: “I still don’t see it.”
So both parties sign for the destruction of the replica, but only the original Nai’s signing will actually trigger the death of the replica. The replica Nai’s signing will only SIMULATE the death of a simulated replica Nai (the “real” Nai being untouched). Though if this happened, wouldn’t they ‘desync’, as in not be able to communicate? (Presuming I understand your meaning of desync.)
> Paper: ”… If you’re the replica, it doesn’t matter whether you do yourself, you’ll still get saved either way, but you’re incented not to do yourself because having a simulated spike stuck through your simulated head will probably be pretty uncomfortable. But also, if you’re the original, you’re sort of doomed either way, you’re just incented to run off and attempt Carcony, but there’s no way the replica would survive you doing that, and you probably wouldn’t either, you wouldn’t do that to me. Would you?”
I don’t follow the “original” reasoning; if you’re the original and you do yourself the spike goes through the replica’s head, no? So how do you do Carcony at that point?
> The test build is an order of magnitude hardier than Nai’s older Cloud-Sheet. As such, the testing armature is equipped to apply enough pressure to pierce the Cloud-Sheet’s shielding, and so it was made possible for the instances to conspire to commit to the legal murder of Consenter Nai Scissors Bridger Glass Stratton.
So piercing the shielding of the old brain (Cloud-Sheet) is important because the various Nais (ambiguous: all 4, or just 3 of them?) are conspiring to murder normal-Nai, and they need to pierce the Cloud-Sheet for that. But aren’t most new brains they test hardier than the one Nai is using? So isn’t it normal that the testing spike could pierce her old brain?
> A few things happened in the wake of Consenter Paper Stratton’s act of praxis.
omit “act of”, sorta redundant.
> but most consenter-adjacent philosophers took the position that it was ridiculous to expect this to change the equations, that a cell with thrice the mass should be estimated to have about thrice the anthropic measure, no different.
This does not seem consistent with the universe. If that was the case then it would have been an issue going smaller and smaller to begin with, right?
Also, 3x lattices make sense for error correction (like ECC RAM), but not 3x mass.
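To illustrate what I mean, a minimal triple-modular-redundancy sketch in Python (my own illustration, nothing from the story): three redundant copies of one computation vote, a single fault is masked, and the triple still constitutes exactly one logical computation rather than three.

```python
from collections import Counter

def tmr(compute, x):
    """Triple modular redundancy: run the same computation three times
    (on three physical lattices, in the hardware analogy) and take a
    majority vote. A fault in any one copy is masked, but the three
    copies together still implement exactly one logical computation."""
    results = [compute(x) for _ in range(3)]  # three redundant runs
    value, votes = Counter(results).most_common(1)[0]
    if votes < 2:
        raise RuntimeError("no majority: more than one copy faulted")
    return value

# Three interleaved copies of one step still yield one answer.
print(tmr(lambda x: x * x, 7))  # -> 49
```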
> The consenter union banned the use of mirror chambers in any case where the reasonable scoring of the anthropic measure of the test build was higher than the reasonable scoring of a consenter’s existing build.
this presents a problem for testing better brains; curious if it’s going to be addressed.
I just noticed “Consenter Nai Paper-Chell-Glass-Stratton”: the ‘Paper’ refers to the rock-paper-scissors earlier (confirmed by a later Nai reference). She’s only done this 4 times now? (This being replication, or the mirror chamber.)
Earlier, “The rational decision for a selfish agent instead becomes...” implies the rational decision is to execute the original. Presumably this is an option the consenter always has? Like they get to choose which one is killed? Why would that be an option? Why not just have a single button such that when they both press it, the replica dies, with no choice in the matter?
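A sketch of the single-button protocol I have in mind (purely hypothetical; the story clearly works differently): both instances must sign, and the target is hard-wired to the replica, so neither signer ever chooses who dies.

```python
class DualConsentTrigger:
    """Hypothetical mechanism: fires only once both instances have
    signed, and the target is fixed in advance to be the replica,
    so no signer ever selects a victim."""

    def __init__(self, required_signers):
        self.required = set(required_signers)
        self.signatures = set()

    def sign(self, instance_id):
        self.signatures.add(instance_id)
        if self.signatures == self.required:
            self.execute_replica()  # target hard-wired, no choice

    def execute_replica(self):
        print("replica terminated")

trigger = DualConsentTrigger({"paper", "scissors"})
trigger.sign("paper")     # nothing happens yet
trigger.sign("scissors")  # only now does it fire
```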
> Scissors: “I still don’t see it.”
Scissors is slower, so Scissors dies?
> Paper wonders, what does it feel like to be… more? If there were two of you, rather than just one, wouldn’t that mean something? What if there were another, but it were different?… so that-
I thought this was Paper thinking, not wondering aloud. In that light:
> Scissors: “What would it feel like? To be… More?… What if there were two of you, and one of me? Would you know?”
looks like partial mind reading or something, like super mental powers (which shouldn’t be a property of running a brain 3x over but I’m trying to find out why they concluded Scissors was the original)
> Each instance has now realised that the replica- its brain being physically more massive- has a higher expected anthropic measure than the original.
At this point in the story, isn’t the idea that it has a higher anthropic measure because it’s 3 brains interleaved, not 1, while the parenthetical bit (“its brain … massive”) isn’t a reason? (Also, regarding the mass thing that comes in later: what if they made 3 brains interleaved with the total mass of one older brain?)
Anyway, I suspect answering these issues won’t be necessary to get an idea of anthropic measure.
(continuing on)
> Anthropic measure really was the thing that caused consenter originals to kill themselves.
I don’t think this is rational FYI
> And if that wasn’t true of our garden, we would look out along the multiverse hierarchy and we would know how we were reflected infinitely, in all variations.
> [...]
> It became about relative quantities.
You can’t take relative quantities of infinities or subsets of infinities (it’s all 100% or 0%, essentially). You can have *measures*, though. David Deutsch’s *The Beginning of Infinity* goes into some detail about this, both generally and with respect to many worlds and the multiverse.
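A concrete instance of the measure move (standard number theory, nothing specific to Deutsch): the evens and the naturals are both countably infinite, so naive relative quantity is undefined, yet natural density recovers the intuitive answer:

$$d(A) = \lim_{n \to \infty} \frac{|\{k \in A : k \le n\}|}{n}, \qquad d(\text{evens}) = \lim_{n \to \infty} \frac{\lfloor n/2 \rfloor}{n} = \frac{1}{2}.$$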
> having no memories of being on the wrong side does not make this any more pleasant an experience to go through, nor does it provide any reassurance against being the replica (presuming that’s the one which is killed)
It provides no real reassurance, but it would make it a more pleasant experience to go through. The effect of having observed the coin landing heads many times and never tails is going to make it instinctively easy to let go of your fear of tails.
> Um okay, wouldn’t they have maybe thought about this after 15 years of training and decades of practice in the field?
Possibly! I’m not sure how realistic that part is, to come to that realization while the thing is happening instead of long before, but it was kind of needed for the story. It’s at least conceivable that the academic culture of the consenters was always a bit inadequate; maybe Nai had heard murmurings about this before, then the murmurer quietly left the industry, and the Nais didn’t take it seriously until they were living it.
> I don’t think we’ve sufficiently established that the “3 minds 1 brain” build actually hosts 3 minds
The odds don’t have to be as high as 3:1 for the decision to come out the same way.
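Spelled out: with a replica-to-original measure ratio of $r$, refusal becomes the selfish choice as soon as $P(\text{replica}) = \frac{r}{r+1} > \frac{1}{2}$, i.e. whenever $r > 1$ at all; the 3:1 figure just makes the margin vivid.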
> Holy shit this society is dystopic.
The horrors of collectively acknowledging that the accessible universe’s resources are finite and that growth must be governed in order to prevent Malthus from coming back? Or is it more about the bureaucracy of it? Yeah, you’d really hope they’d be able to make exceptions for situations like this, hahaha (but, lacking an AGI singleton, my odds for society still having legal systems with this degree of rigidity are honestly genuinely pretty high; like, yeah, human societies are bad, it’s always been bad, it is bad right now, it feels normal if you live under it, you Respect The History and imagine it couldn’t be any other way).
> Wouldn’t this be an abdication of responsibility as mentioned in the prev paragraph?
An abdication of the expectations of an employer, much less than an abdication of the law. Transfers to other substrates with the destruction of the original copy are legal, probably even commonplace in many communities.
I think this section is really confusing. They’re talking about killing the original, which the chamber is not set up to do, but they have an idea as to how to do it. The replica is just a brain; they would only experience their brain being impaled with a rod, and it wouldn’t actually happen. They would be sitting there with their brain seemingly leaking out while somehow still conscious.
> omit “act of”, sorta redundant.
Praxis consists of many actions; the reaction was very much towards this specific act. This is when people noticed.
> If that was the case then it would have been an issue going smaller and smaller to begin with, right?
They only started saying things like this after the event. Beforehand it would have been very easy to deny that there was an issue. Those with smaller brains had the same size bodies as everyone else, just as much political power, exactly the same behaviours.
> She’s only done this 4 times now?
The other names are from different aspects of life. Some given by parents, some taken on as part of coming of age. Normally a person wouldn’t take on names from executing a mirror chamber, in this case it was because it was such a significant event and a lot of people came to know Nai as Paper from the mirror chamber transcript.
> looks like partial mind reading or something
It’s because a lot of their macrostate is still in sync. This is a common experience in mirror chambers, so they don’t remark on it.
> You can have *measures*, though.

It certainly is about measures.
Well, that sounds interesting. Measure theory is probably going to be crucial for getting firm answers to a lot of the questions I’ve come to feel responsible for. Not just anthropics; there’s some simulist multiverse stuff as well.
> I don’t think this is rational FYI
How so?
I should probably mention, if I were just shoved into that situation… we probably wouldn’t kill the original. I wouldn’t especially care about having reduced measure or existing less, because I consider myself a steward of the eschaton rather than a creature of it. If anything, my anthropic measure should be decreased as much as possible, so that the painful path of meaning and service that I have chosen has less of an impact on the beauty of the sum work. Does that make sense? If not, never mind; it’s not so important.
I’m going to need to find a way to justify making Paper and Scissors less ingroupy in the way they talk to each other. The premise that we would be able to eavesdrop on a conversation between a philosopher and a copy of that philosopher and understand what they were on about is kind of implausible, and it comes through here. Maybe I should just not report the contents of the conversation, or turn that up to an extreme: have everything they say be comically terse and idiolectic, then explain it after the fact. Actually, yes, I think that would be great. Intimacy giving rise to idiolect needs to be depicted more.