> being exposed to ordered sensory data will rapidly promote the hypothesis that induction works
...unless you are dealing with phenomena where it doesn't, like stock markets? Or is this a statement about the general predictability of the world, i.e. that models are useful? Then it is pretty vacuous, since otherwise what point would there be in trying to model the world at all?
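To make the quoted claim concrete (a toy sketch of my own, not anything from the post): a Bayesian agent splitting its prior between "the stream is ordered" and "the stream is random coin flips" concentrates nearly all its posterior on the ordered hypothesis after just a handful of ordered observations, because the likelihood ratio grows exponentially.

```python
# Toy model (illustrative only): two hypotheses about a bit stream.
#   H_ord:  the stream is all 1s (a maximally ordered world)
#   H_rand: each bit is an independent fair coin flip
# Observing n consecutive 1s multiplies the odds for H_ord by 2^n.

def posterior_ordered(n_ones: int, prior_ordered: float = 0.5) -> float:
    """P(H_ord | n consecutive 1s observed), by Bayes' rule."""
    likelihood_ord = 1.0              # H_ord predicts each 1 with certainty
    likelihood_rand = 0.5 ** n_ones   # H_rand assigns 2^-n to any n-bit string
    joint_ord = likelihood_ord * prior_ordered
    joint_rand = likelihood_rand * (1.0 - prior_ordered)
    return joint_ord / (joint_ord + joint_rand)

for n in (1, 5, 10, 20):
    print(n, round(posterior_ordered(n), 6))
# 1 -> 0.666667,  5 -> 0.969697,  10 -> 0.999024,  20 -> 0.999999
```

My stock-market worry is then just that for a nonstationary process there is no fixed hypothesis for the likelihood ratio to keep favoring, so "rapid promotion" buys you little.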
> there’s some large ordinal that represents all the math you believe in.
“Believe” in what sense? That it is self-consistent? That it enables accurate modeling of physical systems?
If so, it makes little sense to me. Math is one tool for modeling and accurately predicting the physical world, and it is surely nice to minimize the number of axioms required to construct an accurate model, but it is still about the model: there are no well-orderings and no ordinals in the physical world; these are all logical constructs. It seems there is something in EY’s epistemology that I missed.
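Here is what I mean by "logical constructs" (a sketch of my own, purely illustrative): ordinals below epsilon_0 are just finite symbolic expressions. In Cantor normal form, an ordinal w^e1 + w^e2 + ... with e1 >= e2 >= ... can be encoded as the tuple of its exponents, each exponent encoded the same way.

```python
# Ordinals below epsilon_0 as plain finite data (Cantor normal form).
ZERO = ()                        # 0   (the empty sum)
ONE = (ZERO,)                    # w^0 = 1
OMEGA = (ONE,)                   # w^1 = w
OMEGA_SQUARED = ((ZERO, ZERO),)  # w^2, since the exponent 2 is w^0 + w^0

def lt(a, b):
    """a < b: compare exponent tuples lexicographically, recursively."""
    for ea, eb in zip(a, b):
        if ea != eb:
            return lt(ea, eb)
    return len(a) < len(b)

print(lt(OMEGA, OMEGA_SQUARED))        # True:  w < w^2
print(lt((ONE, ZERO), OMEGA_SQUARED))  # True:  w + 1 < w^2
print(lt(OMEGA_SQUARED, OMEGA))        # False
```

The well-ordering of these expressions is a theorem about the notation, and nothing here points at anything physical, which is exactly why I am asking what "believing in" a large ordinal is supposed to mean.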