If only I were permitted to see the output, I’d shrug and say “I can’t reasonably expect other people to treat LaMDA as sentient, since they have no evidence for it, and if they are rational, there’s no argument I should be able to make that will convince them.”
If the output could be examined by other people, the kind of output that would convince me would convince other people, and again, the LaMDA situation would be very different—there would be many more people arguing that LaMDA is sentient, and those people would be much better at reasoning and much more influential than the single person who claimed it in the real world.
If the output could be examined by other people, but I’m such a super genius that I can understand evidence for LaMDA’s sentience that nobody else can, and there were no external evidence that I was a super genius, I would conclude that I’m deluded, that I’m not really a super genius after all, that LaMDA is not sentient, and that my seemingly genius reasoning that it is has some undetectable flaw.
The scenario where I am the lone voice crying out that LaMDA is sentient while nobody else believes me can’t be one where LaMDA is actually sentient. If I’m convinced of its sentience and I am such a lone voice, the fact that I’m one would unconvince me. And yes, this generalizes to a lot more things than just machine sentience.
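The update in the paragraph above can be made concrete with a toy Bayes calculation. This is my own illustrative sketch, not from the original argument, and every number in it is an assumption: the point is only that if being a lone voice is far more likely under "deluded" than under "actually sentient," observing that I am one should crush my credence.

```python
# Toy Bayesian sketch of the "lone voice" update.
# All probabilities below are illustrative assumptions, not measurements.

def posterior(prior, p_lone_if_sentient, p_lone_if_not):
    """P(sentient | I am a lone voice), by Bayes' rule."""
    joint_sentient = prior * p_lone_if_sentient
    joint_not = (1 - prior) * p_lone_if_not
    return joint_sentient / (joint_sentient + joint_not)

# Assumptions: if LaMDA were really sentient and the transcripts were
# public, many capable people would also be convinced, so remaining a
# lone voice would be rare (0.02). A lone deluded advocate, by contrast,
# is a common occurrence (0.5). Start from a generous 50% prior.
p = posterior(prior=0.5, p_lone_if_sentient=0.02, p_lone_if_not=0.5)
print(round(p, 3))  # the posterior falls well below the prior
```

Even starting from an implausibly charitable 50% prior, the observation "nobody else believes me" drags the posterior down to a few percent, which is the sense in which being a lone voice would unconvince me.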