The claim that uploaded brains don’t work because of chaos turns out not to hold up, because it is usually easier to control a divergence than to predict it: you can use strategies like fast-feedback control to keep the system from ever entering the chaotic region. More generally, a lot of misapplications of chaos theory start by incorrectly assuming that hardness of prediction equals hardness of control, when that requires further assumptions.
I think I might have also once seen this exact repeated-bouncing-ball example done with robot control, demonstrating how, even with apparently stationary plates (maybe using electromagnets instead of tilting?), a tiny bit of high-speed computer-vision control could beat the chaos and make the ball bounce accurately many more times than the naive calculation says is possible, but I can’t immediately refind it.
See more below:
https://www.lesswrong.com/posts/epgCXiv3Yy3qgcsys/you-can-t-predict-a-game-of-pinball#wjLFhiWWacByqyu6a
I also like tailcalled’s comment on the situation.
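To make the control-beats-prediction point concrete, here is a minimal toy sketch of my own (not anything from the linked post): a chaotic logistic map where a microscopic initial error makes long-range prediction hopeless, yet a tiny bounded feedback nudge each step keeps the state sitting on the otherwise-unstable fixed point. All the constants are illustrative.

```python
# A minimal, self-contained sketch (my own toy example, not from the linked
# post) of the prediction-vs-control asymmetry. The logistic map with r = 3.9
# is chaotic, so a 1e-6 error in the initial state ruins long-range prediction,
# yet a tiny bounded feedback nudge applied every step keeps the state pinned
# at the otherwise-unstable fixed point. All constants are illustrative.

R = 3.9                       # parameter in the chaotic regime
X_STAR = 1.0 - 1.0 / R        # unstable fixed point of x -> R*x*(1-x)
MAX_NUDGE = 0.02              # the controller may only push this hard per step

def logistic(x):
    return R * x * (1.0 - x)

def run(n_steps, controlled):
    x = X_STAR + 1e-6         # start a hair away from the fixed point
    for _ in range(n_steps):
        x = logistic(x)
        if controlled:
            # Fast feedback: nudge the new state back toward X_STAR, but never
            # by more than MAX_NUDGE (a weak actuator suffices near the target).
            correction = X_STAR - x
            x += max(-MAX_NUDGE, min(MAX_NUDGE, correction))
    return x

print("uncontrolled after 50 steps:", run(50, False))  # typically far from X_STAR
print("controlled   after 50 steps:", run(50, True))   # pinned at ~X_STAR
```

The asymmetry is that the controller only ever needs to cancel a small, currently observed error, whereas the predictor has to carry every microscopic error forward through the exponential divergence.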
To call something an “uploaded brain” is to make two claims. First, that it is a (stable) mind. Second, that it is in some important sense equivalent to a particular meat brain (e.g., that its output is the same as the meat brain, or that its experiences are the same as the meat brain’s). The sorts of methods you’re talking about to stabilize the mind help with the first claim, but not with the second.
I’ve always struggled to make sense of the idea of brain uploading because it seems to rely on some sort of dualism. As a materialist, I find it obvious that a brain is a brain, that a program replicating the brain’s output is a program (one that will perform its task more or less well, but probably not perfectly), and that the two are not the same.
I think the crux of it is here:
I think that basically everything in the universe can be considered a program/computation, but I also think the notion of a program/computation is quite trivial.
More substantively, I think it might be possible to replicate at least some parts of the physical world with future computers that have what is called physical universality, where they can manipulate the physical world essentially arbitrarily.
So I don’t view brains and computer programs as being of two different types, but rather as instances of the same type: a program/computation.
See below for some intuition as to why.
http://www.amirrorclear.net/academic/ideas/simulation/index.html
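As a toy illustration of the “trivial” sense in which any deterministic physical system can be read as a program/computation (my own sketch, not something from the linked page): its time evolution is just a transition function applied over and over to a state. The damped spring-mass system and constants below are arbitrary stand-ins.

```python
# A toy sketch (mine, not the commenter's or the linked page's) of the trivial
# sense in which a deterministic physical system is a computation: its time
# evolution is just a transition function mapping state to state. The system is
# a damped spring-mass oscillator; every constant is an arbitrary illustration.

DT = 0.01          # time step (s)
K = 4.0            # spring constant (N/m)
C = 0.5            # damping coefficient (kg/s)
M = 1.0            # mass (kg)

def transition(state):
    """One step of the 'program': (position, velocity) -> (position, velocity)."""
    x, v = state
    a = (-K * x - C * v) / M      # Newton's second law for the oscillator
    v = v + a * DT                # semi-implicit Euler: velocity first...
    x = x + v * DT                # ...then position with the updated velocity
    return (x, v)

state = (1.0, 0.0)                # pull the mass out 1 m and release it
for _ in range(1000):             # "running the program" == evolving the system
    state = transition(state)
print(state)                      # the oscillation has largely decayed toward rest
```

In this trivial sense a brain qualifies just as readily as any piece of software, which seems to be the sense in which the comment treats the two as the same type.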
The ancients considered everything to be the work of spirits. The medievals considered the cosmos to be a kingdom. Early moderns likened the universe to a machine. Every age has its dominant metaphors. All of them are oversimplifications of a more complex truth.
There are no properties of the brain that define it as “you”, except for the program that it runs.
Suppose you had an identical twin with identical genes and, until very recently, an identical history. From the perspective of anyone else, you’re similar enough to be interchangeable with each other. But from your perspective, the twin would be a different person.
The brain is you, full stop. It isn’t running a computer program; its hardware and software are inseparable and developed together over the course of your life. In other words, the hardware/software distinction doesn’t apply to brains.