These algorithms are useful maps of the brain and mind. But is computation also the territory? Is the mind a program? Such a program would need to exist as a high-level abstraction of the brain that is causally closed and fully encodes the mind.
I said it in one of your previous posts but I’ll say it again: I think causal closure is patently absurd, and a red herring. The brain is a machine that runs an algorithm, but algorithms are allowed to have inputs! And if an algorithm has inputs, then it’s not causally closed.
The most obvious examples are sensory inputs—vision, sounds, etc. I’m not sure why you don’t mention those. As soon as I open my eyes, everything in my field of view has causal effects on the flow of my brain algorithm.
Needless to say, algorithms are allowed to have inputs. For example, the mergesort algorithm has an input (namely, a list). But I hope we can all agree that mergesort is an algorithm!
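To belabor that point with a minimal Python sketch (nothing here beyond textbook mergesort): the list being sorted is supplied from outside the algorithm, and nobody takes that as evidence that mergesort isn't an algorithm.

```python
def mergesort(xs):
    # Base case: a list of length 0 or 1 is already sorted
    if len(xs) <= 1:
        return xs
    # Recursively sort each half, then merge the two sorted halves
    mid = len(xs) // 2
    left, right = mergesort(xs[:mid]), mergesort(xs[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

# The input comes from outside the algorithm; the output depends on it.
print(mergesort([5, 2, 9, 1]))  # [1, 2, 5, 9]
```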
The other example is: the brain algorithm has input channels where random noise enters in. Again, that doesn’t prevent it from being an algorithm. Many famous, central examples of algorithms have input channels that accept random bits—for example, MCMC.
And in regard to “practical CF”: suppose I run MCMC on my computer while sitting outside, and I use an anemometer attached to the computer as the source of the random input bits entering the MCMC run. Then it’s true that you would need an astronomically complex, hyper-accurate atmospheric simulator to reproduce this exact run of MCMC, but I don’t understand why, on your view, that fact would be important. It’s still true that my computer is implementing MCMC “on a level of abstraction…higher than” atoms and electrons. The wind flowing around the computer is relevant to the random bits, but it is not part of the calculations that comprise MCMC (which involve the CPU instruction set, etc.).

By the same token, if thermal noise mildly impacts my train of thought (as it always does), then it’s true that you would need to simulate my brain down to the jiggling atoms to reproduce this exact run of my brain algorithm. But that seems irrelevant to me, and in particular it’s still true that my brain algorithm is “implemented on a level of abstraction of the brain higher than biophysics”. (Heck, if I look up at the night sky, then you’d need to simulate the entire Milky Way to reproduce this exact run of my brain algorithm! Who cares, right?)
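To make the computer half of that concrete, here is a minimal Metropolis-Hastings sketch in Python. The `uniform` argument is any callable returning numbers in [0, 1): `random.random` works, and a hypothetical `anemometer_uniform` fed by wind readings would work just as well. Swapping one for the other changes which exact chain you get, but not which algorithm the machine is running.

```python
import math
import random

def metropolis_hastings(log_density, x0, n_steps, uniform, step_size=1.0):
    """Sample from a 1-D target density with Metropolis-Hastings.

    `uniform` is any callable returning draws in [0, 1). The algorithm is
    defined over that abstract stream of random numbers; whether the numbers
    physically come from a PRNG or an anemometer is not part of the algorithm.
    """
    samples, x = [], x0
    for _ in range(n_steps):
        # Symmetric random-walk proposal built from one uniform draw
        proposal = x + step_size * (2.0 * uniform() - 1.0)
        # Accept with probability min(1, p(proposal) / p(x))
        if math.log(uniform() + 1e-12) < log_density(proposal) - log_density(x):
            x = proposal
        samples.append(x)
    return samples

# Standard normal target, with an ordinary PRNG as the randomness source...
samples = metropolis_hastings(lambda x: -0.5 * x * x, 0.0, 10_000, random.random)

# ...or, hypothetically, wind-derived bits instead; the MCMC algorithm is unchanged:
# samples = metropolis_hastings(lambda x: -0.5 * x * x, 0.0, 10_000, anemometer_uniform)
```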