If consciousness is defined solely in terms of behavior (which may well be reasonable, but is itself an assumption), then yes, it is true that something that behaves exactly like a human will be conscious IFF humans are conscious.
But what we are trying to ask, at the high level, is whether there is something coherent in conceptspace that partitions objects into “conscious” and “unconscious” in a way that resembles what we mean when we talk about “consciousness,” and then whether it applies to the GLUT. Demonstrating that the claim holds for a particular set of definitions only matters if we are convinced that one of the definitions in that set accurately captures what we are actually discussing.
Things that are true “by definition” are generally not very interesting.