Eliezer suggested that, in order to avoid acting unethically, we should refrain from casually dismissing the possibility that other entities are sentient. I responded that I think that’s a very good idea and we should actually implement it. Implementing that idea means questioning assumptions that entities aren’t sentient. One tool for questioning assumptions is asking “What do you think you know, and why do you think you know it?” Or, in less binary terms, why do you assign things the probabilities that you do?
Now do you see the relevance of my asking why you believe what you believe, with whatever degree of confidence you hold it?
I’m not trying to “win the debate”, whatever that would entail.
Tell you what though, let me offer you a trade: If you answer my question, then I will do my best to answer a question of yours in return. Sound fair?
I’m assuming that you assign it a high probability.
I personally am assigning it a high probability only for the sake of argument.
Since I am assigning it for the sake of argument, I don’t have, and don’t need, any reason for doing so beyond its usefulness in the argument.