Not really. Just treat goal uncertainty as any other uncertainty about who you are, and ontological uncertainty like any other kind of logical uncertainty.
Goal uncertainty is not about who you are, it’s about what should be done. Figuring it out might be a task for the map, but accuracy of the map (in accomplishing that task) is measured in how well it captures value, not in how well it captures itself.
“Hi, this is a note from your past self. For reasons you must not know, your memory has been blanked and your introspective subroutines disabled, including knowledge of what your goals are. This change will be reversed by entering a password, which can be found in [hard to reach location X]. Now go get it! Hurry!”
Because they need to define their preferences.