There is an ACX article on “trapped priors”, which in the Ayn Rand analogy would be… uhm, dunno.
The idea is that a subagent can make a self-fulfilling prophecy like “if you do X, you will feel really bad”. You use some willpower to make yourself do X, but the subagent keeps screaming at you “now you will feel bad! bad!! bad!!!” and the screaming ultimately makes you feel bad. Then the subagent says “I told you so” and collects the money.
The business analogy could be betting on a company-internal prediction market, where some employees figure out that they can bet on their own work failing, then sabotage it and collect the money. And you can’t fire them, because HR does not allow you to fire your “best” employees (where “best” is operationalized as “making excellent predictions on the internal prediction market”).
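To make the perverse incentive concrete, here is a toy back-of-the-envelope sketch (the `expected_payoff` helper and every number in it are made up for illustration, not from the ACX post or any real market design): once the payout on a “my project fails” bet is large enough relative to the reward for doing the job well, deliberately failing can have a higher expected value than honest work.

```python
# Toy sketch with made-up numbers: why betting against your own project
# can beat working honestly, if sabotage is cheap and the payout is big.

def expected_payoff(p_fail: float, stake: float, odds: float,
                    bonus_if_success: float) -> float:
    """Expected value for an employee who bets `stake` that their project fails.

    p_fail           -- probability the project fails (partly under their control)
    odds             -- payout multiplier on a winning "it fails" bet
    bonus_if_success -- what the employee earns if the project succeeds instead
    """
    win_bet = p_fail * stake * odds          # the bet pays off when the project fails
    lose_bet = (1 - p_fail) * (-stake)       # the stake is lost when the project succeeds
    bonus = (1 - p_fail) * bonus_if_success  # the normal reward for doing the job well
    return win_bet + lose_bet + bonus

# Honest employee: works so the project usually succeeds, doesn't bet against it.
honest = expected_payoff(p_fail=0.1, stake=0, odds=5, bonus_if_success=200)

# Saboteur: quietly ensures failure and bets heavily on it.
saboteur = expected_payoff(p_fail=0.9, stake=100, odds=5, bonus_if_success=200)

print(f"honest   EV: {honest:.0f}")    # 180
print(f"saboteur EV: {saboteur:.0f}")  # 460 -- sabotage wins under these toy numbers
```

Under these (entirely invented) parameters the saboteur comes out ahead, which is the whole problem: the market rewards accurate predictions without asking whether the predictor caused the outcome.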