I don’t think you can escape stories entirely. I would claim that as soon as you summarize your facts or data, the mere selection of which facts to present is already the crafting of a story. Even dumping all your data and every observation is likely to be biased by which data you collected and what you paid attention to: what you thought were the relevant things to report to another person.
It’s tempting to claim direct knowledge of things, even as an empiricist, because it would ground your observations in facts. But the reality seems to be that everything is mediated at least by sensory experience (not to mention the other ways experience is mediated in things as complex as humans), so we are always stuck with at least the stories our sensory organs enable: for example, your experience of pressure waves in the air as sound. I’d say it goes even deeper than that, being a fundamental consequence of information transfer via the intentional relationship between subject and object, but we probably don’t need to move beyond the pragmatic level for the current discussion.
This is also why I worry, in the context of AI alignment, that Goodharting cannot be eliminated (though maybe we can mitigate it enough that it doesn’t matter): representationalism (indirect realism) plants the seed of all misalignment between reality and the measurement of it, so we will always have to work actively against a gradient that pulls us toward divergence.
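To make that worry concrete, here is a toy sketch (my own illustration, not anything proposed above) of one flavor of Goodharting: when we select on a noisy proxy for a true value, harder optimization pressure systematically widens the gap between what we measure and what is real. The function names and setup are invented for the example.

```python
import random

random.seed(0)

def goodhart_gap(n_candidates: int, noise: float = 1.0) -> float:
    """Pick the candidate that maximizes a noisy proxy; return
    proxy_score - true_score for the winner (the 'Goodhart gap')."""
    true_scores = [random.gauss(0, 1) for _ in range(n_candidates)]
    # The proxy is the true value seen through an imperfect measurement.
    proxies = [t + random.gauss(0, noise) for t in true_scores]
    best = max(range(n_candidates), key=lambda i: proxies[i])
    return proxies[best] - true_scores[best]

def avg_gap(n_candidates: int, trials: int = 2000) -> float:
    """Average the gap over many trials to smooth out randomness."""
    return sum(goodhart_gap(n_candidates) for _ in range(trials)) / trials

# More candidates = stronger selection pressure on the proxy.
# The measured-vs-real gap grows as we optimize the proxy harder.
print(avg_gap(2), avg_gap(100))
```

Selecting the maximum of the proxy preferentially picks candidates whose measurement error happened to be positive, so the winner’s proxy score overstates its true score, and the overstatement grows with the size of the pool being optimized over.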
I think we can say something stronger than this: we can’t escape stories at all, because stories seem to be another way of talking about ontology (maps), and we literally can’t talk about anything without framing it within some ontology.