I mostly dismiss the entire argument here, because on my understanding the brain determines the best possible outcomes given a set of beliefs (i.e., experiences). Based on some Boolean logic over its sense of self, others, and reality, the resulting actions are derived from quantum wave-function collapse given the belief set, the current stimulus, and the possible actions. I’m not trying to prove here why I believe these processes are quantum, except to say that to think otherwise is to claim that quantum effects are not part of nature and not part of evolution. That is what would actually need to be proven, given how efficient evolution is and how electrical and decentralized our brains are. So determining how many transistors would be needed, and how much computational depth, is moot if we assume a Newtonian brain, since I think we’d be solving the wrong problem. AI will also beat us at ordinary cognition, but emotion and valence only come with the experiences we have and the beliefs we form about them, and that will remain the problem with AI until we approach the problem correctly.