“You Are A Brain” is not strictly accurate, to my mind, but it’s catchy and sufficiently less wrong to be useful as a hook for the concepts.
My preferred clarification is “I’m not my brain, I’m something my brain is doing.”
That’s right. The more accurate, less catchy title would be “You Are Implemented On A Homo Sapiens Brain”.
If there’s nothing implemented on it, is it still a brain?
If you have a pattern of silicon that doesn’t perform computations, is it still a chip?
Depending on how you answer that question, “You are an aspect of a homo sapiens brain” would work better.
In any case, we all seem to understand what we’re talking about, and the difference between our definitions isn’t one we’re likely to trip over in practice, so they’re both carving reality at its joints.
No, “you are an organism”.
You are a mammal, and all that is within your skin is you. This includes the unconscious bits, as well as the conscious running dialogue in your head. This includes all your other organs, whose functioning affects the functioning of your brain.
Does this include the other organisms inside my skin?
Sure, why not? If there were no gas-producing bacteria living in our guts, we would not have such an appreciation of fart humor. Thus, the habits of some other organisms living inside us do have some effect on the contents of our experience.
If I were a brain in a vat, and had the same arrangement of neurons as I do now, I’d be just as appreciative of fart humor. I just wouldn’t be able to fart. I might eventually lose interest because of that, but that’s because my brain is changing.
Your body does influence you somewhat, especially the parts that secrete hormones, but then, everything that isn’t your body does too.
It’s not really obvious where “you” begins and ends, but we can at least say that it’s mostly the brain.
We are connected with the world. Most trivially, without food, oxygen, and temperature in a given range, we would die. Less trivially, we are influenced by friends, television, the amount of light during the day, etc. Similarly, we are influenced by our internal organs: pain, digestion, inebriation, fatigue.
Remove any of this, and the person changes, more or less. Conventionally, a very big change is usually fatal, and we say the person has died. Small changes in the environment (external or internal) often cause temporary changes, which we call mood: a person in a different mood is considered the same person; their functioning is a bit different, but with high probability their mood will sooner or later change again. So there seems to be a clear line between a “macrochange” (death) and a “microchange” (mood).
Over a longer time, microchanges can accumulate into something that no longer feels like a microchange, because we cannot expect it to change back. We say that people grow and their personalities change. This can sometimes feel discomforting, but it’s considered normal.
In the right environment, the microchanges can accumulate faster; this can be called brainwashing (when intentionally caused by other people) or a sudden change of personality (when caused by drugs, illness, or brain damage), and the idea is very discomforting to other people.
What does this all mean? The boundary between “me” and “not me” is not completely clear. There is a body, which under normal conditions cannot split or merge with other bodies, so this is a conventional basis for identity (another basis is memory). But the mind inside the body keeps changing. By uploading we are trying to preserve the mind and discard the body: trying to discard the most obvious conventional source of identity, and preserve the more fluid ones (memory, personality).
Every part removed means change; generally, more removed parts mean greater change. With different inputs, your outputs will also become different. It starts with “I cannot fart” and continues with “I never feel happiness or sadness”, unless you have a farting interface or a hormone-generating interface. If you cannot fart anymore, you have changed a bit. If you cannot feel happiness, desire, love, curiosity, you have changed more.
Perhaps we could try to estimate how much various factors contribute to our personality, so we could aim to preserve, say, 80% of the personality. We could decide to sacrifice farting but preserve mood changes. In any case, that would be just the first iteration. In the second iteration, the transhuman individual could decide to remove some unpleasant moods and add more pleasant ones, and so on. If it happens in many iterations, slowly enough, we will feel that it is normal. If it happens faster, it will be discomforting for other people. After a long enough time, there will be only a shared path (not even a single shared identity, if copying and merging ems becomes possible).
But this is not an argument against uploading. Even if my mind is destined to disappear—either by quick change at the moment of death, or by accumulated slow changes after uploading—I would prefer the slow way, if I had a choice, because that idea feels less painful. It’s just a warning that any form of uploading will change the personality, or at least enable future changes, which may feel OK for the person changing, but may be shocking to outside observers.
True!
The statement “You are a brain” means that your brain is the part of you that is essential to your identity. This is not entirely accurate, and other threads in this discussion address some clarifications. But essentially, it makes the point that an injury that destroys part of your brain would cause you to be a different person in a way that the loss of a limb would not.
Suppose I define myself otherwise, identifying only with my mind rather than my body. Would there be any reason to argue your definition is better?
And before anyone tries to remind me that the mind isn’t separate from the body—consider that it’s still useful to talk about computer programs as computer programs, separately from the hardware that runs them, even though these programs cannot run except on hardware.
Most computer programs are not very platform-specific, and so the hardware is whatever approximation to a Turing machine you have handy. But if code is embedded in a platform, to the point that it will not run on any other platform, how meaningful is it to discuss the difference between software and hardware?
My mental experience is of being a body, and so it’s not clear to me what it would be to exist purely mentally.
That strikes me as a really big if. I’m not sure this is even theoretically possible.
I meant that in an engineering sense, not a theoretical one, and deliberately moved from “computer program” to “code.” If I have Lisp code that I want to run, and all I have available is C, it’s not going to work directly. To get it to work, it’s often easier to write a Lisp interpreter in C and reuse the old code than to rewrite the code myself. And that’s for two languages both intended to be used by humans and to run on silicon substrates with binary logic.
And so, can you write a ‘human interpreter’ on silicon with binary logic? Theoretically, sure. Practically, there might not be enough silicon to faithfully emulate one in anything close to realtime. But even if you manage it, you’ve just moved the platform into the realm of software: you haven’t divorced the code from the platform.
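For concreteness, here is a minimal sketch of the “interpreter embedded in a host language” move described above (in Python rather than C, purely to keep it short): a toy evaluator for Lisp-style arithmetic expressions. Everything in it is illustrative; the point is only that the hosted code’s “platform” is now itself a piece of software.

```python
# A toy Lisp-style arithmetic evaluator hosted in another language.
# Illustrative sketch only: the "platform" the s-expressions run on
# is itself software.

def tokenize(src):
    """Split an s-expression string into tokens."""
    return src.replace("(", " ( ").replace(")", " ) ").split()

def atom(tok):
    """Numbers become floats; anything else stays a symbol."""
    try:
        return float(tok)
    except ValueError:
        return tok

def parse(tokens):
    """Build a nested list from the token stream (consumes tokens)."""
    tok = tokens.pop(0)
    if tok == "(":
        expr = []
        while tokens[0] != ")":
            expr.append(parse(tokens))
        tokens.pop(0)  # discard the closing ")"
        return expr
    return atom(tok)

OPS = {
    "+": lambda a, b: a + b,
    "-": lambda a, b: a - b,
    "*": lambda a, b: a * b,
}

def evaluate(expr):
    """Recursively evaluate a parsed expression."""
    if isinstance(expr, float):
        return expr
    op, *args = expr
    return OPS[op](*[evaluate(a) for a in args])

print(evaluate(parse(tokenize("(+ 1 (* 2 3))"))))  # 7.0
```

The hosted code never touches the hardware directly; it only ever sees the interpreter, which is exactly the sense in which the platform has been moved into software without being eliminated.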
My informed but ultimately presumably inaccurate guess is: if I buy about a million or so high-end GPUs, and a few hundred petabytes of hard drives, I am somewhere in the ballpark of a human brain.
Given Moore’s law, that number is going to diminish. Given more knowledge of neurology, valuable reductions in simulation complexity will be possible; you probably won’t need a chromodynamics simulation to accurately replicate a personality, since the thermal noise in our brainware is far too great to depend on that kind of accuracy.
But yes, a human interpreter is ultimately possible, because human minds are neuron activity, neuron activity is physics, and physics is, as far as we know, Turing-computable.
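For what it’s worth, the usual back-of-envelope arithmetic behind that kind of ballpark looks like the sketch below. Every constant is a commonly cited round number or an outright assumption (neuron and synapse counts, spike rate, per-event simulation cost, per-GPU throughput), and the result swings by orders of magnitude as those assumptions move.

```python
# Back-of-envelope brain-compute estimate. Every constant here is a
# commonly cited round number or an outright assumption.
neurons = 8.6e10            # ~86 billion neurons
synapses_per_neuron = 1e4   # ~10,000 synapses per neuron
firing_rate_hz = 10         # assumed average spike rate
flops_per_event = 10        # assumed cost to simulate one synaptic event

brain_flops = neurons * synapses_per_neuron * firing_rate_hz * flops_per_event
gpu_flops = 1e12            # ~1 TFLOPS, roughly a high-end GPU of the time

print(f"brain: ~{brain_flops:.1e} FLOP/s")
print(f"GPUs in the ballpark: ~{brain_flops / gpu_flops:,.0f}")
```

With these numbers the estimate lands around 10^17 FLOP/s and on the order of 10^5 GPUs; assume a costlier per-event simulation and the “million or so” figure falls out. The point is only that the conclusion is hostage to the assumptions, not that any particular figure is right.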
That’s what I think, too.
Either way, what I’m trying to argue against is the “you are an organism” thing. Not everything within my skin is necessary to run the program; surely not the colon or the metatarsal bones? To me it makes little more sense to call the entire body “me” than to call my car “me”. Either way it’s a vehicle, even if at the present state of technology I’m rather stuck with this one.
I suspect that when we, in a hurry to signal allegiance to reductionism and materialism, tell people things like “you are an organism” (or even “you are a brain”, unless a proper explanation follows), many among those listeners who are actually interested in truth (rather than just in absorbing acceptable beliefs of the Tribe of Scientifically Literate) will reasonably feel there’s something not quite right about this, that it dismisses something that’s actually important. They might not say so, being ashamed of possibly being seen as believing in souls or some other silly nonphysicalism, but it will still not ring true to them.
So I think getting this right matters. Otherwise we’re helping fuel the resistance to reductionism among the unconvinced.
I’m actually sure it is not theoretically possible.
Depending on what is meant by the question, either “I’m a pattern of information encoded in my brain” or “I’m a side-effect of processing that my brain does for its own reasons” is the one-sentence description I’d use.
I identify with the meat that currently contains/creates/executes me, but not perfectly—I don’t expect that replacing the meat with a sufficiently-similar replacement would alter the experience of being me.
Identity, of course, is a continuum, not a binary measure. A different brain with the same patterns and inputs would be so much like me that I’d call them the same. But really, it might be no more similar than future me and past me in the “same” body.
What do you mean by “its own reasons”? Do you mean that you exist purely because of your circumstances, or something along the lines of: your brain makes decisions, and you’re just the qualia it makes when it does so?