Some people are in favor of creating a singleton AI to rule the universe. [...]
Republicans and Democrats would both want a singleton to take care of them; just in different ways. Those who don’t want a singleton at all would be Libertarians.
This quote (especially the last sentence) shows a total lack of understanding of what “ruling the universe” by AI is about. It’s not about tribal chiefs and liberty, no. You won’t find a technical description of how things should be in the history books—only crazy dogma and clumsy rules of thumb.
I think that Phil was talking about what Americans in general would say that they want from an AI. It should go without saying that what they would say would reveal fundamental misunderstandings of what they were talking about.
It seems not.
On the contrary, it would seem so.
There is a distinction between the normative “what these people should choose” and the descriptive “what these people’s (uninformed) choice would be”. I suspect Phil is taking both to give the same result in this post. (Phil?)
Neither of those alternatives makes any sense to me. An uninformed choice would be a random choice. What they “should” choose sounds like what they would choose if they were much better reasoners than they are.
In what sense did you mean to describe the Democrats’/Libertarians’/etc. choice about a singleton? Their actual, informed choice? Does this choice disagree with what you think they should choose (in the sense of what they would choose if they were smarter and better informed than they can ever actually be, but with the same values, which are not necessarily the same as yours)?
An uninformed choice would be a random choice.
Nothing is random in this sense. A choice made with limited ability to reason, even an “informed” one, still involves a fair amount of noise. Not being reliably informed of what the question actually means is just a degree of the same problem. Thus, I meant “uninformed choice” in the sense of a choice that is not reliably the correct one, for one reason or another.
Well, I do have a total lack of understanding of what your last sentence is supposed to communicate. And I’m pretty sure that by “total lack of understanding” you mean “failure to agree with my personal, vague, never-fully-communicated ideas.”