Thus: being secure > working to be secure > not being secure > being secure.
As judged at different times, under different circumstances (having less or more money, being less or more burned out). This doesn’t sound like a “real” intransitive preference.
Whatever simulation the FAI decides on for post-singularity humanity, I think I'd rather be free of it to fuck up my own life. Me and many others… Why should we trust an AI that maximizes human utility, even if it understands what that means?
But then, your freedom is a factor in deciding what’s best for you. It sounds like you’re thinking of an FAI as a well-intentioned but extremely arrogant human, who can’t resist the temptation to meddle where it rationally shouldn’t.