Two things:
First, welcome to Less Wrong! As you may have gathered, this is a blog devoted to the question of human rationality—techniques we can use to more accurately draw correct conclusions from what we see. If you want to introduce yourself to everyone, feel free to do so on the latest Welcome Thread.
Second, regarding the content of your post, I don’t think that’s quite what Eliezer Yudkowsky was talking about. He was talking about the sort of evidence we use right here and now to decide that we are conscious, not the sort of evidence sufficient to determine that some other thing is conscious. You can tell if an ordinary human being has a headache by hearing them say, “I’ve got a headache”, but that doesn’t mean a computer which runs speech-synthesis to emit audio recognizable as the English words “I’ve got a headache” has any such thing. It’s not supposed to be a general any-context test, and it doesn’t have to be.