I don’t think you can give me a moment of pleasure that intense without using 3^^^3 worth of atoms on which to run my brain, and I think the leverage penalty still applies then. You definitely can’t give me a moment of worthwhile happiness that intense without 3^^^3 units of background computation.
The article said the leverage penalty “[penalizes] hypotheses that let you affect a large number of people, in proportion to the number of people affected.” If this is all the leverage penalty does, then it doesn’t matter if it takes 3^^^3 atoms or units of computation, because atoms and computations aren’t people.
That said, the article doesn’t precisely define what the leverage penalty is, so there could be something I’m missing. So, what exactly is the leverage penalty? Does it penalize how many units of computation you can affect, rather than how many people? That seems much less arbitrary than the vague notion of a “person,” and much easier to define: simply divide the prior of each hypothesis by the number of bits your actions flip under it, then normalize.
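To make that proposal concrete, here is one way the bit-flip penalty might be written out; the notation N(H) and Z is mine, not the article’s, so treat this as a sketch of the idea rather than an official definition:

\[
P_{\text{penalized}}(H) \;=\; \frac{1}{Z}\cdot\frac{P(H)}{N(H)},
\qquad
Z \;=\; \sum_{H'} \frac{P(H')}{N(H')},
\]

where \(P(H)\) is the unpenalized prior and \(N(H)\) is the number of bits your actions can flip if hypothesis \(H\) is true. A mugger whose story requires you to flip \(3\uparrow\uparrow\uparrow 3\) bits then takes a prior penalty on the order of \(1/3\uparrow\uparrow\uparrow 3\), which is exactly the size needed to cancel the astronomically large claimed payoff.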
[You] can even strip out the part about agents and carry out the reasoning on pure causal nodes; the chance of a randomly selected causal node being in a unique^100 position on a causal graph with respect to 3↑↑↑3 other nodes ought to be at most 100/3↑↑↑3 for finite causal graphs.
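The arithmetic behind that bound is plain counting; the quote doesn’t spell it out, so the following is my reconstruction. If “unique^100” means the special position is occupied by at most 100 nodes, and the graph contains at least \(3\uparrow\uparrow\uparrow 3\) nodes, then a uniformly random node is one of the special ones with probability

\[
\Pr[\text{special}] \;\le\; \frac{100}{3\uparrow\uparrow\uparrow 3}.
\]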
You’re absolutely right. I’m not sure how I missed that or forgot having read it.