I think the OP’s overarching concern is something like a narrow utilitarianism whose decision algorithm computes EV over only a limited range of time horizons and decision sizes. There is unknown EV in exploring the world more personally and in reproducing knowledge and skills. My hunch is that optimizing a human life combines these different aspects at least multiplicatively.
Expected value calculations have limits for decisions that will change your worldview (e.g. exploration), or for decisions about goods you don’t yet have a good model of (e.g. education).
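
To make the “at least multiplicatively” hunch concrete, here is a toy sketch (all labels and scores are hypothetical, not anything from the OP) comparing additive and multiplicative aggregation of a few aspects of a life plan. Under the multiplicative rule, neglecting any one aspect drags the whole score toward zero, which an additive rule lets you compensate for.

```python
import math

# Hypothetical scores in [0, 1] for three aspects of a life plan:
# narrow expected value (measurable outcomes), exploration (updating
# your worldview), and reproduction (teaching / transmitting skills).
plans = {
    "balanced":     {"narrow_ev": 0.6, "exploration": 0.6, "reproduction": 0.6},
    "ev_maximizer": {"narrow_ev": 1.0, "exploration": 0.2, "reproduction": 0.2},
}

def additive(scores):
    """Additive aggregation: weak aspects can be offset by strong ones."""
    return sum(scores.values()) / len(scores)

def multiplicative(scores):
    """Multiplicative aggregation (geometric mean): neglecting any one
    aspect pulls the whole score toward zero."""
    return math.prod(scores.values()) ** (1 / len(scores))

for name, scores in plans.items():
    print(f"{name:12s}  additive={additive(scores):.2f}  "
          f"multiplicative={multiplicative(scores):.2f}")

# The gap between the two plans roughly doubles under the multiplicative
# rule: the EV-maximizer's neglected aspects drag the product down.
```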