I think it’s worth distinguishing between a legal contract and setting the AI’s motivational system, even though the latter is a contract in some sense.
To restate/clarify my above comment: I agree, but I think we are likely to delegate tasks to AIs by setting their motivational systems, not by drafting literal legal contracts with them. So the PAL is relevant to the extent that it works as a metaphor for setting an AI's motivational system and source code; in that context contract enforceability isn't an issue, and Stuart is making a mistake to be thinking about literal legal contracts (assuming that he is doing so).
Thanks for clarifying. That’s interesting and seems right if you think we won’t draft legal contracts with AI. Could you elaborate on why you think that?
Well, because I think they wouldn't be enforceable in precisely the really bad cases the contracts would be trying to prevent :) Also, by default people currently delegate tasks to computers by writing software, and I expect that to continue in the future (although I guess smart contracts are an interesting edge case here).