Roleplaying As Yourself
(This is a basic intuition pump I’ve found helpful in making decisions, and maybe you’ll like it too.)
For all its shortcomings, I think there was something quite useful about the “What Would Jesus Do?” meme within the Christian framework. Of course it’s not a very sophisticated ethical guide, and it comes with all kinds of biases; but asking it does put the believer into a frame of mind that emphasizes things like compassion and duty, and it sometimes helps the believer generate options that weren’t in their default solution space.
Is there a version of this handy tool for the consequentialist, with our muddled mixture of selfish and altruistic goals and impulses, and the added difficulty that we’re looking to actually optimize rather hard?
The one that works best for me is a double roleplay.
Jernau Gurgeh, champion of strategic games across the galaxy, sits down to a nice futuristic immersive roleplaying game: The Orthonormal Experience. Gurgeh will be controlling a denizen of the early 21st century on Earth, someone with the online name of Orthonormal. Getting Orthonormal to do well by Orthonormal’s own standards is Gurgeh’s objective.
Just as our roleplaying games have game masters who can call out uncharacteristic plans, so too does Gurgeh’s game. He can’t simply use his superior vantage point to calculate the right stocks for Orthonormal to buy today and sell tomorrow, because Orthonormal couldn’t do that except by luck. He can’t even have Orthonormal think at peak performance on some days; there are character attributes (penalties like Anxiety Disorder) he has to play around.
But Gurgeh is able to think patiently, and strategically, about the various obstacles blocking Orthonormal’s progress, and to guide Orthonormal’s thoughts in plausible ways to work on these. There are a lot of points out there to be scored: better states to reach in Orthonormal’s relationships, career, inner life, and more.
What would Gurgeh do?
One last note: a couple of the bugs with this approach can be confronted within the approach itself. If I decide that Gurgeh might do X, and I try X and fail, it can be tempting to get frustrated with myself. But this isn’t what Gurgeh would do next! He’d take my failure as more data about what this character’s current attributes are, and look for ways to work around that failure mode or to train the relevant attribute. And he’d probably give the character a short rest to recover mana before trying again.