More precisely, it makes sense to concentrate on mechanisms for optimizing my environment for any given value, and security is a very important part of that.
Wouldn’t that be a bad idea? If you change your mind as to what you value, then Future!you will optimize for something Present!you doesn’t want. Since you’re only worried about Present!you’s goals, that would be bad.
Sure, if I’m only worried about Present!me’s goals, then the entire rest of the paragraph you didn’t bother quoting is of course false, and the sentence you quoted, for which that paragraph was intended as context, is also false.
Sorry. I missed a word when I read it the first time.