A Somewhat Vague Proposal for Grounding Ethics in Physics
As Tegmark argues, the idea of a “final goal” for AI is likely incoherent, at least if, as he states, “Quantum effects aside, a truly well-defined goal would specify how all particles in our Universe should be arranged at the end of time.”
But “life is a journey, not a destination.” So what we should really be specifying is the entire evolution of the universe over its lifespan. The question, then, is how the universe can “enjoy itself” as much as possible before the big crunch (or before and during the heat death).*
I hypothesize that experience is related to, if not a product of, change. I further propose (counter-intuitively, and as a first pass badly in need of “refinement”, to put it mildly)** that we treat experience as inherently positive and not try to distinguish between positive and negative experiences.
Then it seems to me the (still rather intractable) question is: how does the rate of entropy increase relate to the quantity of experience produced? Is the relation simply linear (in which case pacing doesn’t matter, ethically)? My intuition is that it is more like the fuel efficiency of a car: non-linear, with a sweet spot somewhere between lengthy boredom and a flash of intensity.
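To make the linearity point concrete, here is a minimal toy model in Python. Everything in it is hypothetical: the fixed entropy budget S_total, the constant-pacing assumption, and both functional forms are invented purely for illustration, not drawn from any actual physics. It contrasts a linear experience-rate function with a made-up “fuel efficiency” curve whose experience-per-unit-entropy peaks at an intermediate rate:

```python
import numpy as np

# Toy model (all forms and constants hypothetical). Suppose the universe has a
# fixed entropy budget S_total, spent at a constant rate s = S_total / T over a
# lifespan T, producing experience at a rate f(s). Total experience is T * f(s).

S_total = 1.0  # arbitrary units

def f_linear(s, k=1.0):
    # Linear case: total experience = T * k * (S_total / T) = k * S_total,
    # the same for every lifespan T, so pacing is ethically irrelevant.
    return k * s

def f_sweet_spot(s, s_opt=0.5):
    # Hypothetical fuel-efficiency-like curve: experience per unit entropy,
    # f(s) / s = s * exp(-s / s_opt), peaks at s = s_opt and falls off toward
    # both a crawl ("lengthy boredom") and a burn ("flash of intensity").
    return s**2 * np.exp(-s / s_opt)

for T in (0.5, 2.0, 8.0):   # candidate lifespans
    s = S_total / T         # constant entropy-production rate
    print(f"T={T:3}: linear total = {T * f_linear(s):.3f}, "
          f"sweet-spot total = {T * f_sweet_spot(s):.3f}")
```

In the linear case the total comes out identical for every T, while the sweet-spot curve is maximized at an intermediate lifespan (here T = 2, i.e. s = s_opt). Nothing hinges on the particular curve chosen; the point is only that non-linearity is what makes pacing ethically relevant.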
*I’m not super up on cosmology; are there other theories I ought to be considering?
**One idea for refinement: successful “prediction” (undefined here) creates positive experiences; frustrated expectations negative ones.