Monarchy is clearly the best form of government for appropriate value of variable monarch. What else is FAI rearranging matter in the light cone after all?
An entirely different form of singleton. Even presidents and dictators don’t qualify as “Monarchs” and they are a whole lot more similar to a King than an FAI is.
Where does it say your absolute ruler needs to be human? :P
Jest aside, you are right: the kinds of AIs people normally talk about when discussing FAI are sufficiently different from any human mind, or even from what we may intuitively imagine a mind to be, for the comparison to be grossly misleading. Talking about a ruler or supreme judge is a much worse comparison than, say, the old theological comparisons to YHWH, since ironically he was likely to be much more anthropomorphic in many respects than an FAI would be.
The statement was a bit tongue in cheek; I just wanted to point out that monarchy triggers various ick feelings in us because we are mostly anti-authoritarian, yet a supreme AI is the ultimate authoritarian form of government, since anything it sets out to do, it will do.
The reason I wanted to point to this bias is that I've been considering that there may be other (local) maxima of the function good(concentration of power, trustworthiness) at lesser values of concentrated power and trustworthiness.
Here’s how the discussion between Moldbug and Hanson went:
M: Decision markets won’t work well if P, and we don’t know that ~P.
H: We have data from lab and field experiments, and we always find ~P.
M: Well, induction is useless. Why should I believe ~P for a system you haven’t experimented on?
H: Well, here are some theoretical arguments suggesting ~P.
M: Oh, I don’t deny that often ~P. But how do you know that always ~P?
H: Is there any kind of evidence I could present that would convince you that ~P in the relevant cases?
M: Nope. The problem is, you’re thinking like a social scientist. You need to think like a philosopher.
H: Okay.… so what does thinking like a philosopher reveal?
M: We need a monarch.
That is an amusing and not too inaccurate summary. Up voted!