Sorry for the allegorical language if it offended you.
I am not offended
There is a difference between not finding a solution to a problem, and not even understanding what a solution might look like, even in the abstract.
Certainly. And further on that scale, there is "understanding so little of the problem that you're not even sure there's a problem in the first place".
Progress on the P vs NP problem has been largely limited to determining what the solution doesn't look like, and few if any people have any idea what it does look like, or if it (a solution) even exists (it might be undecidable).
So, this scale goes:
Solved problems
Unsolved problems where we have a pretty good idea what the solution looks like
Unsolved problems where we have no idea what the solution looks like: subjective experience is not here
Problems we suspect exist, but can't even define properly in the first place: subjective experience is here!
It is also not a good sign when the problem gets to be more of a mystery the more science we discover.
Consciousness and the subjective experience of pain have not gotten more mysterious the more science we discover. At worst, we understand exactly as much now as we did when we started, i.e. nothing (and neurologists would certainly argue we do understand more now).
The concern here is that we have an irrational view that rationalism is a universal tool.
It is. Have a look at Solomonoff induction.
It's not proof, but it is evidence.
What makes you think that these problems are "in their very nature unsolvable by reason"? Is it because you think they are inherently mysterious?
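(For reference, a minimal sketch of what Solomonoff induction formalizes, assuming a universal monotone Turing machine U: the universal prior gives an observed bit string x the weight M(x) = Σ_{p : U(p) = x*} 2^(-|p|), summing over every program p whose output begins with x, with |p| the program's length in bits; the predicted probability that x continues with bit b is then the conditional M(xb) / M(x). Shorter programs, i.e. simpler hypotheses, dominate the sum, which is the sense in which it is offered as a universal, though uncomputable, model of inference.)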
I will make a point about the progress of science in this subject and then use that to step towards a more general argument for the innate mystery of consciousness with regard to reason.
Ever since the Enlightenment there has been a real movement in the West to view the world as purely mechanical/physical, so that reason could be accepted as a universal tool. That meant eliminating from society not just God but also the soul and other things.
Ironically, it was a particular invention of science and reason that made rationalists realize the problem of eliminating all non-mechanical/physical realities from a human being: the computer. With the development of the computer it became painfully obvious that human beings were fundamentally different from any designed piece of technology. Although a computer could theoretically be designed and programmed for all kinds of amazing functions, there is no rational model as to how to make that machine 'conscious'. It was through computers that mankind realized, in the clearest and bluntest way, the mystery of consciousness.
So, to reassert my point: from the development of computers over the last century through their advancement into this one, the more we progress, the more we understand that consciousness does not seem to be a matter of just complexity and sophistication.
Secondly, our faculty of reason itself does not even work in the same way a computer works. A computer's mechanical structure "signals" a conclusion. The machine moves in a certain way, albeit at the tiniest levels, to signal that something is right or wrong. For us, it is understanding that makes us realize right or wrong; it is a feeling. Even at the most fundamental level of using reason itself, the mystery of consciousness is engaged and operating in a way we do not understand.
With the development of the computer it became painfully obvious that human beings were fundamentally different from any designed piece of technology.
Evidence-based citation needed. (From a neurologist or computer scientist. Nothing about how our own massively parallel architecture differs from the von Neumann architecture.)
The more we understand of the workings of the brain, the more we can mimic it on a computer. ("Ha, but these are simple tasks! Not difficult tasks like consciousness." How convenient of you to have chosen a metric you can't even define to judge progress towards full understanding of the human brain.)
there is no rational model as to how to make that machine ‘conscious’.
And there was no such model before the development of computers either.
Your unstated assumption seems to be that it is rational to expect a quick development of a "model of consciousness" (whatever that is) after the invention of the computer. If that were so, you might have a point, but, again: evidence needed.
Secondly, our faculty of reason itself does not even work in the same way a computer works.
Evidence-based citation needed.
Our brain runs on physics. Although there may be various as-yet-unknown algorithms running in our brain, there is no reason to assume anything non-computational is going on.
Will you change your mind if/when whole brain emulation becomes feasible?
Our brain is physical, no doubt, but as you can imagine I am making a claim that mind (consciousness, spirit, whatever you want to call it) is not the same as brain. There is a connection between the two, but my argument, using rational judgment, is that consciousness does not seem to be physical, because there is no way to understand it rationally. Your point against me is what I use against you. You say I am mistaken because I cannot even define what consciousness is; I say that is precisely the point! The only way you can reply is to hold out for the view that consciousness may not even exist, so it may not be a problem in the first place. And that is a whole other issue, for if consciousness is only an illusion, that breaks down the entire human experience of reality.
Furthermore, there are other reasons why the idea of a purely physical human being without any mysterious non-physical reality is extremely problematic:
It would mean no free will. To deny free will is to deny rationality to begin with. How can a conclusion made by reason in turn negate reason?
It would deny any real morality. Fundamentally, a human being would be the same as a piece of wood, except more complex.
It is the Western insistence that reason be a universal tool (and therefore that reality be universally physical) that has led to the complete denial of dualism. But if you recognize that reason itself is pointing towards its own limits, dualism is not that bad of a conclusion.