I mean like a dozen people have now had long comment threads with you about this. I doubt this one is going to cross this seemingly large inferential gap.
The short answer is that, from the perspective of an AI, it really sucks to have basically all property owned by humans; many humans won’t be willing to sell things that AIs really want; buying things is much harder than just taking them when you have a huge strategic advantage; and doing most big things with the resources on Earth while keeping it habitable is much harder than doing them while ignoring habitability.
> I mean like a dozen people have now had long comment threads with you about this. I doubt this one is going to cross this seemingly large inferential gap.
I think it’s still useful to ask for concise reasons for certain beliefs. “The Fundamental Question of Rationality is: ‘Why do you believe what you believe?’”
Your reasons could be different from the reasons other people give, and indeed, some of your reasons seem to be different from what I’ve heard from many others.
> The short answer is that from the perspective of AI it really sucks to have basically all property be owned by humans
For what it’s worth, I don’t think humans need to own basically all property in order for AIs to respect property rights. A few alternatives come to mind: humans could hold only a minority share of the wealth, or AIs could maintain property rights with each other.