Not quite. It actually replaces it with the problem of maximizing people’s expected reported life satisfaction. If you wanted to choose to try heroin, this system would be able to look ahead, see that that choice will probably drastically reduce your long-term life satisfaction (more than the annoyance at the intervention), and choose to intervene and stop you.
I’m not convinced ‘what’s best for people’ with no asterisk is a coherent problem description in the first place.
Sure, I accept the correction.
And, sure, I’m not convinced of that either.