That’s perfectly credible since it implies a lack of leverage.
10^10 is not a significant factor compared to the sensory experience of seeing something float in a bathtub.
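To make that comparison concrete, here is a rough log-odds sketch; the 10^40 likelihood ratio assigned to the bathtub observation is an illustrative assumption, not a figure from the exchange:

```python
import math

# A prior-odds penalty of 10^10 amounts to only ~33 bits to overcome.
penalty_bits = math.log2(1e10)        # ~= 33.2 bits

# Illustrative assumption: seeing something visibly float in the bathtub
# is taken to be 10^40 times likelier if the claim is true than if it is
# a hallucination or a trick.
evidence_bits = 40 * math.log2(10)    # ~= 132.9 bits

print(f"penalty:  {penalty_bits:.1f} bits")
print(f"evidence: {evidence_bits:.1f} bits")
# Under these numbers the observation outweighs the 10^10 factor by
# roughly 100 bits, which is the sense in which 10^10 is "not significant".
```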
Oh, I assumed that negative leverage is still leverage, given that it might amount to the equivalent of killing a googolplex of people (assuming you equate never being born with killing).
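By contrast, the same bookkeeping at googolplex scale suggests why such a penalty could never be paid by observation; reading the leverage penalty as prior odds of 1/N for N people affected is an assumption made here for illustration:

```python
import math

# Googolplex = 10**(10**100). The integer is far too large to construct,
# but the size of a 1/googolplex leverage penalty in bits is easy:
log10_googolplex = 10**100
penalty_bits = log10_googolplex * math.log2(10)   # ~= 3.3e100 bits

# A generous bound on a human lifetime of sensory data is ~10^18 bits,
# so no experience could supply ~3.3e100 bits of evidence.
print(f"bits required: {penalty_bits:.2e}")
```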
To build an AI one must be a tad more formal than this, and once you start trying to be formal, you will soon find that you need a prior.
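As a minimal sketch of what being formal might force on you, here is one toy way to write such a prior down; both the complexity term and the 1/leverage term are illustrative assumptions, not a specification from the thread:

```python
import math

def log2_prior(description_bits: float, lives_at_stake: float) -> float:
    """Toy leverage-penalized prior in log2 (values are negative).

    description_bits: Occam/complexity penalty for the hypothesis.
    lives_at_stake:   leverage penalty of 1/N for affecting N people.
    """
    return -description_bits - math.log2(lives_at_stake)

# A 100-bit hypothesis claiming to save 10^10 lives starts ~133 bits
# down -- recoverable. The same hypothesis at googolplex scale is not.
print(log2_prior(100, 1e10))   # -> ~-133.2
```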
I see. I cannot comment on anything AI-related with any confidence. I thought we were talking about evaluating the likelihood that a certain model in physics is accurate; in that case, anthropic considerations seem irrelevant.