Rationality is basically "how to make an accurate map of the world, and how to WIN" (where "win" means getting what you want, and "want" includes all your preferences, stuff like morality, etc.).
Before rationality can tell you what to do, you have to tell it what it is you’re trying to do.
If your goal is to save lives, rationality can help you find ways to do that. If your goal is to turn stuff into paperclips, rationality can help you find ways to do that too.
I’m not sure I quite understand what you mean by “rationally motivating” reasons.
As far as reasons objectively compelling to any sentient (let me generalize that to any intelligent being)… why should there be any such thing? “Doing this will help ensure your survival.” “But… what if I don’t care about that?”
“Doing this will bring you joy.” “So?”
And so on. There are No Universally Compelling Arguments.