The way you presented AU here makes me think of it in terms of “attachment”: the way we tend to get attached to outcomes that haven’t happened yet but that we expect to, and can then be surprised in good or bad ways when the outcomes turn out better or worse than we expected. In this way impact seems tied to our capacity to expect to see what we expect to see (meta-expectations?), e.g. I 100% expect a 40% chance of X and a 60% chance of Y. That 100% meta-expectation creates a kind of attachment that leaves no room for being wrong, so merely seeing something happen in a way that makes you want to update your object-level expectations of X and Y after the fact seems to create a sense of impact.
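To make the “no room for being wrong” point concrete, here’s a minimal sketch (my own toy framing, not anything from the post, and all names are hypothetical): treat the meta-expectation as a belief distribution over the probability p of X. A 100% meta-expectation is a point mass at p = 0.4, while a hedged one spreads some belief across other values of p. After observing X, Bayes’ rule shifts the hedged belief smoothly, but the point mass can’t move at all; any revision has to happen outside the model, which is one way of reading the jarring “sense of impact” above.

```python
# Toy sketch (my framing, hypothetical names): a meta-expectation as a
# distribution over p = P(X happens), updated by Bayes' rule after one
# observation of X.

def update_belief_over_p(prior, observed_x=True):
    """Bayes update of a belief over p = P(X), given one observation."""
    likelihood = {p: (p if observed_x else 1 - p) for p in prior}
    z = sum(prior[p] * likelihood[p] for p in prior)  # normalizing constant
    return {p: prior[p] * likelihood[p] / z for p in prior}

# "100% meta-expectation": all belief mass on p = 0.4, none anywhere else.
dogmatic = {0.4: 1.0}

# Hedged meta-expectation: same mean (0.4), but some mass on being wrong.
hedged = {0.2: 0.25, 0.4: 0.50, 0.6: 0.25}

print(update_belief_over_p(dogmatic))  # {0.4: 1.0} -- nothing can change
print(update_belief_over_p(hedged))    # mass shifts toward higher p
```

On this toy picture, you could read “impact” as how far the posterior lands from the prior: the hedged belief registers that distance inside the model, while the dogmatic one can’t, so the update arrives as pure surprise instead.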