I still don’t get why people have to use all these indirect abstractions like measure rather than just thinking in terms of ambient control over the multiverse directly.
Because they need to define their preferences.
Not really. Just treat goal uncertainty as any other uncertainty about who you are, and ontological uncertainty like any other kind of logical uncertainty.
Goal uncertainty is not about who you are; it’s about what should be done. Figuring it out might be a task for the map, but the accuracy of the map (in accomplishing that task) is measured by how well it captures value, not by how well it captures itself.
“Hi, this is a note from your past self. For reasons you must not know, your memory has been blanked and your introspective subroutines disabled, including knowledge of what your goals are. This change will be reversed by entering a password, which can be found in [hard to reach location X]. Now go get it! Hurry!”