There’s an interesting issue here.
The agent might have a constitution such that it places no subjective value on changing its subjective values to something more easily fulfilled. The current-agent would prefer not to change its values; the hypothetical-agent would prefer that the values had already been changed. I was just reading the posts on Timeless Decision Theory, and this seems like a problem TDT would have a tough time grappling with.
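To make the asymmetry concrete, here is a toy sketch in Python (my own framing and made-up scoring functions, nothing from the TDT posts): the current agent scores outcomes with its current values, under which a value change buys nothing, while the hypothetical post-change agent scores the same outcomes with the new, easier-to-satisfy values.

```python
# Toy sketch of the value-change asymmetry (illustrative only, not from TDT).

def score_current(outcome: dict) -> float:
    # Current values: only satisfaction of the original goal counts; the act
    # of changing values contributes nothing (here it is mildly dispreferred).
    return outcome["original_goal"] - (1.0 if outcome["values_changed"] else 0.0)

def score_changed(outcome: dict) -> float:
    # Hypothetical new values: easier to satisfy, so a world in which the
    # change has already happened scores higher under them.
    return outcome["new_goal"] if outcome["values_changed"] else outcome["original_goal"]

keep   = {"values_changed": False, "original_goal": 3.0, "new_goal": 9.0}
switch = {"values_changed": True,  "original_goal": 3.0, "new_goal": 9.0}

print(score_current(keep) > score_current(switch))   # True: the current-agent prefers not switching
print(score_changed(switch) > score_changed(keep))   # True: the hypothetical-agent prefers having switched
```

Each agent's preference is coherent by its own lights; the conflict only appears when you ask which utility function gets to do the evaluating.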
It also seems plausible that someone is systematically neg-karmaing me again.