I mean like a dozen people have now had long comment threads with you about this. I doubt this one is going to cross this seemingly large inferential gap.
I think it’s still useful to ask for concise reasons for certain beliefs. “The Fundamental Question of Rationality is: ‘Why do you believe what you believe?’”
Your reasons could be different from the reasons other people give, and indeed, some of your reasons seem to be different from what I’ve heard from many others.
The short answer is that, from the perspective of the AI, it really sucks to have basically all property owned by humans.
For what it’s worth, I don’t think humans need to own basically all property in order for AIs to obey property rights. A few alternatives come to mind: humans could hold only a minority share of the wealth, or AIs could have property rights with each other.