What’s your estimate of H/D?
My intuitive sense is that it’s <2.
I would think it’s far higher than that. Probably H/D>100, and it might be far higher still. I tend to think that maintaining some continuity of identity would be very important to uploaded minds (because, honestly, isn’t that the whole point of uploading your mind instead of just emulating a random human-like mind?). I also tend to think that there are vast categories of experiences that you would not put yourself through just so you could be the kind of person who had been through that experience; if there are mind-states that can only be reached by, say, “losing a child and then overcoming that horrible experience after years of grieving through developing a kind of inner strength”, then I can’t imagine any mind would intentionally do that to themselves just to explore more sections of mind-space.
Or, think about it in terms of beliefs. Say that mind A is an atheist. Do you think that the person who has mind A would ever intentionally turn themselves into a theist or into a spiritualist just in order to experience those emotions, and to get to places in mind-space that can only be reached from there? Judging from the whole of human experience, by just preventing yourself from going down that route, you’re probably eliminating at least half of all mind-states that a normal human can reach; many of those mind-states can apparently produce incredibly interesting culture, music, literature, art, etc. Not to mention all the possible mind-states that can only be reached by being a former theist who has lost his faith. And that’s just one example; there are probably dozens or hundreds of beliefs, values, and worldviews that any mind has that it would never want to change, because they are simply too fundamental to that mind’s basic identity. Even with basic things: Eliezer once mentioned, when talking about FAI theory, “My name is Eliezer Yudkowsky. Perhaps it would be easier if my name was something shorter and easier to remember, but I don’t want to change my name. And I don’t want to change into a person who would want to change my name.” (That’s not an exact quote, but it was something along those lines.) I would be surprised if any descendant of your mind would ever get to even 1% of all possible human mind-space.
Not only that, if mind A has a certain set of values and beliefs, and then you make a million copies of mind A and they all interact with each other all the time, I would think that would tend to discourage any of them from changing or questioning those values or beliefs. Usually the main way people change their minds is when they encounter someone with fundamentally different beliefs who seems to be intelligent and worth listening to; on the other hand, if you surround yourself with only people who believe the same thing you do, you are very unlikely to ever change that belief; if anything, social pressure would likely lock it into place. Therefore, I would say that a mind that primarily interacts with other copies of itself would be far more likely to become static and unchanging than that same mind in an environment where it is interacting with other minds with different beliefs.
I can’t imagine any mind would intentionally do that to themselves just to explore more sections of mind-space.
Mm. That’s interesting. While I can’t imagine actually arranging for my child to die in order to explore that experience, I can easily imagine going through that experience (e.g., with some kind of simulated person) if I thought I had a reasonable chance of learning something worthwhile in the process, and if I were living in a post-scarcity kind of environment.
I can similarly easily imagine myself temporarily adopting various forms of theism, atheism, former-theism, and all kinds of other mental states.
And I can even more easily imagine encouraging clones of myself to do so, or choosing to do so when there’s a community of clones of myself already exploring other available paths. Why choose a path that’s already being explored by someone else?
It sounds like we’re both engaging in mind projection here… you can’t imagine a mind being willing to choose these sorts of many-sigmas-out experiences, so you assume a population of clone-minds would stick pretty close to a norm; I can easily imagine a mind choosing them, so I assume a population of clone-minds would cover most of the available space.
And it may well be that you’re more correct about what clones of an arbitrarily chosen mind would be like… that is, I may just be an aberrant data point.
I can easily imagine a mind choosing them, so I assume a population of clone-minds would cover most of the available space.
Ok, so let’s say for the sake of argument that you’re more flexible about such things than 90% of the population is. If so, would you be willing to modify yourself into someone less flexible, into someone who would never want to change himself? If you don’t, then you’ve just locked yourself out of about 90% of all possible mind-space on that one issue alone. However, if you do, then you’re probably stuck in that state for good; the new you probably wouldn’t want to change back.
Absolutely… temporarily being far more rigid-minded than I am would be fascinating. And knowing that the alarm was ticking and that I was going to return to being my ordinary way of being would likely be deliciously terrifying, like a serious version of a roller coaster.
But, sure, if we posit that the technology is limited such that temporary changes of this sort aren’t possible, then I wouldn’t do that if I were the only one of me… though if there were a million of me around, I might.