I never linked complexity to absolute certainty about whether something is sentient, only to a reasonably good likelihood. The complexity of any known calculation-plus-experience machine (most animals, from insects upward) is undeniably far greater than that of any current Turing machine. It is therefore reasonable to assume that consciousness demands a lot of complexity, certainly much more than that of a current language model. Generating experience is fundamentally different from generating only calculations. Yes, this is an opinion, not a fact. But so is your claim!
I know for a fact that at least one human is conscious (myself), because I experience it directly. That is still the strongest reason to assume consciousness exists, and it can't be called into question the way you did.
That’s not correct to do either, for the same reason.
Also, I wasn't going to mention it before (because the reasoning itself is flawed), but there is no sensible way of calculating complexity under which an insect brain comes out more complex than LaMDA.
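To make that comparison concrete, here is a minimal back-of-envelope sketch in Python. None of these numbers come from the discussion above; they are rough public estimates (LaMDA is reported at roughly 137 billion parameters, a fruit-fly brain at around 5×10^7 synapses, a honeybee brain on the order of 10^9 synapses), and treating raw element counts as a complexity proxy is itself an assumption that the argument above disputes.

```python
# Naive complexity proxy: compare raw element counts (orders of magnitude only).
# All figures are rough public estimates, not exact measurements.

ESTIMATES = {
    "LaMDA (parameters)":         137e9,  # ~137B trainable parameters (as reported)
    "Fruit fly brain (synapses)": 5e7,    # ~5x10^7 synapses in the adult connectome (approx.)
    "Honeybee brain (synapses)":  1e9,    # ~10^9 synapses (approx.)
}

baseline = ESTIMATES["LaMDA (parameters)"]
for name, count in ESTIMATES.items():
    ratio = count / baseline
    print(f"{name:<28} {count:.1e}  ({ratio:.0e} of LaMDA's count)")
```

By this naive count LaMDA sits a couple of orders of magnitude above a bee brain; whether parameter count captures the kind of complexity that matters for experience is exactly what is in dispute here.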