I think this argument mostly centers on the definition of certain words, and thus does not change my views on whether I should upload my mind if given the choice.
But can this person be said to understand Chinese? My answer is no.
What you have shown here is what you think the word “understands” means. But everyone agrees about the physical situation here—everyone anticipates the same experiences.
This shows that our brains are highly resilient and adaptive to changes experienced by our minds. By comparison, a digital simulation is very brittle and non-adaptive to change.
The substrate of the simulation, e.g. a silicon chip, is brittle (at our current level of tech), but it can still run a simulation of a neuroplastic brain—just program it to simulate the brain chemistry. Then if the simulated brain is damaged, it will be able to adapt.
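To make the point concrete, here is a toy sketch (purely illustrative—the network, learning rule, and all names are my own assumptions, not a real brain model) showing that adaptivity can live in the simulated dynamics rather than in the physical substrate: a tiny simulated network learns a target behavior, loses half its units to "damage", and then relearns the behavior with the units that remain.

```python
import random

# Toy sketch, not a real brain model: a tiny "simulated brain" that
# relearns a target behavior after some of its units are damaged.
random.seed(0)

N = 20                      # number of simulated units
target = 1.0                # behavior the network should produce
weights = [random.uniform(-1, 1) for _ in range(N)]
alive = [True] * N          # which units are still functioning

def output(weights, alive):
    # Network output: mean contribution of the surviving units.
    active = [w for w, a in zip(weights, alive) if a]
    return sum(active) / len(active)

def adapt(weights, alive, steps=200, lr=0.05):
    # Simple error-driven plasticity (an assumed rule): surviving
    # units adjust their weights to pull output toward the target.
    for _ in range(steps):
        err = target - output(weights, alive)
        for i in range(N):
            if alive[i]:
                weights[i] += lr * err

adapt(weights, alive)                  # initial learning
before = output(weights, alive)

for i in range(N // 2):                # "damage": disable half the units
    alive[i] = False

adapt(weights, alive)                  # plasticity after damage
after = output(weights, alive)

print(abs(target - before) < 0.01)     # learned the target
print(abs(target - after) < 0.01)      # relearned it after damage
```

The silicon running this is as brittle as ever, but the *simulated* system recovers from damage, because the adaptivity was programmed into the simulated dynamics—analogous to simulating the neurochemistry that underlies neuroplasticity.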
The bigger point here is that you are implicitly asserting that in order to be “sentient” a mind must have similar properties to a human brain. That’s fine, but it is purely a statement about how you like to define the word “sentient”.
Only living organisms can possess sentience because sentience provides introspective knowledge that enables them to keep surviving;
“Sentience” has no widely agreed concrete definition, but I think it would be relatively unusual to say it “provides introspective knowledge”. Do you agree that any questions about the actual computation, algorithms or knowledge in a brain can be answered by only considering the physical implementation of neurons and synapses?
sentience would not emerge in artificial systems because they are not alive in the first place.
Again, I think this is purely a statement about the definition of the word “alive”. Someone who disagrees would not anticipate any different experiences as a consequence of thinking an artificial system is “alive”.