Seeking plausible-but-surprising fictional ethics
How badly could a reasonably intelligent follower of the selfish creed, “Maximize my QALYs”, be manhandled into some unpleasant parallel to a Pascal’s Mugging?
How many rules of thumb are there which provide answers to ethical problems such as Trolley Problems, yield verdicts that keep the user from being lynched by an angry mob, and don’t require more than moderate mathematical skill to apply?
Could Maslow’s Hierarchy of Needs be used to form the basis of a multi-tiered variant of utilitarianism? (A toy sketch of what that might look like follows this list.)
Would trying to look at ethics from an Outside View, such as, say, that of a soft-SF rubber-forehead alien, suggest any useful, novel approaches to such problems?
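To make the Maslow question concrete, here is a toy sketch of one way a tiered variant could work, under the simplest possible reading: score every outcome as a vector of per-tier utilities and compare vectors lexicographically, so no surplus at a higher tier can outweigh a deficit at a more fundamental one. The tier names are standard Maslow; everything else is invented for illustration.

```python
# A toy sketch of Maslow-tiered utilitarianism: outcomes are scored as a
# vector of per-tier utilities, ranked lexicographically, so no amount of
# self-actualization can outweigh a deficit in physiological needs.
MASLOW_TIERS = ["physiological", "safety", "belonging", "esteem",
                "self_actualization"]

def tiered_score(outcome):
    """outcome: dict mapping tier name -> total utility at that tier.

    Returns a tuple ordered from most to least fundamental tier, so plain
    tuple comparison implements the lexicographic ranking.
    """
    return tuple(outcome.get(tier, 0.0) for tier in MASLOW_TIERS)

# Feeding one person beats a large gain in esteem for many, under this rule.
feed_one = {"physiological": 1.0}
flatter_many = {"esteem": 100.0}
print(tiered_score(feed_one) > tiered_score(flatter_many))  # True
```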
(I’m writing a story, and looking for inspiration to finalize a character’s ethical system and the consequences thereof. I’m trying to stick to the rules of reality, including those of sociology, so I’m having some trouble coming up with a set of ethics that isn’t strictly worse than the ones I already know of, and that is reasonably novel to someone who’s read the Sequences. If this post doesn’t pan out, my next approach will be to work out the economic system being used, and then which virtues would allow a member to profit; that’s somewhat unsatisfying, but probably good enough if nobody here can suggest something better. So: can you suggest something better? :) )
For story purposes, using a multi-tiered variant of utilitarianism based on social distance could lead to some interesting results. If the character were to calculate his utility function for a given being by something like Calculated Utility = Utility / (Degrees of Separation from me)^2, it would be really easy to calculate, yet come close to what people really use. The interesting part from a fictional standpoint could be if your character rigidly adheres to this function, so that you can manipulate your utility in their eyes by becoming friends with their friends. (E.g., the utility for me of giving a random stranger $10 is 0, assuming effectively infinite degrees of separation, but if they told me they were my sister’s friend, it would be $10/2^2, or $2.50.) It could be fun to play around with the hero’s mind by manipulating the social web.
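For concreteness, here is a minimal sketch of that inverse-square rule, assuming a simple friendship graph and shortest-path “degrees of separation”; the graph, names, and dollar amounts are all made up:

```python
from collections import deque

def degrees_of_separation(graph, source, target):
    """Breadth-first search for the shortest path length between two people.

    Returns None if they are in disconnected components ("infinite" distance).
    """
    if source == target:
        return 0
    seen = {source}
    queue = deque([(source, 0)])
    while queue:
        person, dist = queue.popleft()
        for friend in graph.get(person, ()):
            if friend == target:
                return dist + 1
            if friend not in seen:
                seen.add(friend)
                queue.append((friend, dist + 1))
    return None  # no connection at all

def discounted_utility(base_utility, graph, me, them):
    """Calculated Utility = Utility / (Degrees of Separation)^2."""
    degrees = degrees_of_separation(graph, me, them)
    if degrees is None:
        return 0.0  # infinite separation: a stranger counts for nothing
    if degrees == 0:
        return base_utility  # the agent values themselves at full weight
    return base_utility / degrees ** 2

# Worked example from the comment: a friend of my sister is two degrees away,
# so $10 of utility to them is worth $10 / 2^2 = $2.50 to me.
graph = {
    "me": ["sister"],
    "sister": ["me", "sisters_friend"],
    "sisters_friend": ["sister"],
}
print(discounted_utility(10.0, graph, "me", "sisters_friend"))  # 2.5
print(discounted_utility(10.0, graph, "me", "stranger"))        # 0.0
```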
I think I once heard of a variant of this, only using degrees of kinship instead of social connections. E.g., direct offspring and full siblings are discounted to 50%, grandchildren to 25%, and so forth.
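If it helps, the discounting described here matches the standard coefficients of relatedness from kin-selection theory; a minimal table-lookup sketch, with relationship labels chosen just for illustration:

```python
# Hamilton-style coefficients of relatedness as utility weights, matching
# the 50%/25% figures above. Nieces/nephews at 0.25 and cousins at 0.125
# are the standard values from kin-selection theory.
RELATEDNESS = {
    "self": 1.0,
    "child": 0.5,
    "full_sibling": 0.5,
    "grandchild": 0.25,
    "niece_or_nephew": 0.25,
    "first_cousin": 0.125,
}

def kin_discounted_utility(base_utility, relationship):
    # Unknown relations default to zero weight, i.e. strangers don't count.
    return base_utility * RELATEDNESS.get(relationship, 0.0)

print(kin_discounted_utility(10.0, "grandchild"))  # 2.5
```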
I was just struck by a thought that could combine the two approaches: apply some sort of probability measure to one’s acquaintances for how likely they are to become a blood relative of one’s descendants (a rough sketch follows below). The idea probably needs tweaking, but I don’t think I’ve come across a system quite like it before… Well, at least, not formally. It seems plausible that a number of social systems have ended up applying something like such a heuristic through informal social-evolutionary adaptation, which could provide some fodder for contrasting the Bayesian version against the historically-evolved versions.
Anyone have any suggestions on elaborations?
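To pin the combined idea down, a rough sketch: weight each acquaintance by their current relatedness plus the probability-weighted relatedness they would acquire by marrying into the bloodline. Every probability and weight below is invented, and treating a likely in-law as fractional kin is only one of several possible readings:

```python
# Speculative sketch of the combined heuristic: current relatedness plus
# expected future relatedness over possible pairings. All numbers invented.

def expected_kin_weight(current_relatedness, pairing_prospects):
    """pairing_prospects: list of (probability, relatedness_if_paired)."""
    expected_future = sum(p * r for p, r in pairing_prospects)
    return current_relatedness + expected_future

# A friend of my child: 0 relatedness now, but some chance of marrying my
# child would make them a co-parent of my grandchildren (treated here, very
# roughly, as 0.25 relatedness through those grandchildren).
casual_acquaintance = expected_kin_weight(0.0, [(0.10, 0.25)])
near_certain_match  = expected_kin_weight(0.0, [(0.95, 0.25)])
print(casual_acquaintance)  # 0.025
print(near_certain_match)   # 0.2375: a settled pairing changes everything
```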
Sounds somewhat like the ‘gay uncle’ theory, where having 4 of your siblings’ kids pass on their genes is equivalent to having 2 of your own pass on their genes, but with future pairings included, which is interesting.
Stephen Baxter wrote a couple of novels that explored the first theory a bit in his Destiny’s Children series, where gur pbybal riraghnyyl ribyirq vagb n uvir, jvgu rirelbar fhccbegvat n tebhc bs dhrraf gung gurl jrer eryngrq gb.
The addition of future contributors to the bloodline as part of your utility function could make this really interesting if set in a society that has arranged marriages and/or engagement contracts, as one arranged marriage could completely change the outcome of some deal. Though I guess this is how a ton of history played out anyway, just not quite as explicitly.
How badly could a reasonably intelligent follower of the selfish creed, “Maximize my QALYs”, be manhandled into some unpleasant parallel to a Pascal’s Mugging?

They’d be just as subject to it as anyone else. It’s just that instead of threatening to kill 3^^^3 people, the mugger threatens to torture you for 3^^^3 years, or offers you 3^^^3 years of life, or something similar. The vulnerability comes from having an unbounded utility function, not from any particular utility function: whatever skepticism you bring, the mugger can name stakes large enough to swamp it.
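A toy calculation of why unboundedness is the culprit, with an absurdly skeptical prior and a stand-in for 3^^^3 (which no machine can actually represent); both numbers are made up for illustration:

```python
# However small the probability assigned to the mugger's threat, unbounded
# utility lets the stakes be chosen large enough that the expected
# disutility dominates every ordinary concern.
p_mugger_honest = 1e-50          # invented, absurdly skeptical prior
qaly_stakes = 10 ** 100          # stand-in for 3^^^3 years of torture
expected_loss = p_mugger_honest * qaly_stakes
print(expected_loss)              # 1e+50
print(expected_loss > 1_000_000)  # True: the threat still dominates
```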