I don’t think this is addressable, because of the taboo tradeoffs in current culture around money and class. Some people produce more negative externalities than others in ways our legal system cannot address, so people sequester themselves via money gating, since that is still acceptable in practice even though it is decried explicitly.
What negative externalities are you thinking of? Maybe it’s silly for me to ask you to say, if you’re saying they’re taboo, but I’m looking over all of the elitist taboos and I don’t think any of them raises much of an issue.
Did I mention that my prototype aggregate utility function only counts adjacency desires that are reciprocated? For instance, if a large but obnoxious fanbase all wanted to be next to a single celebrity author who mostly holds them in contempt, the system essentially ignores those connections. Mathematically, the payoff of positioning a and b close together is min(a.desireToBeNear(b), b.desireToBeNear(a)), and desireToBeNear defaults to zero.
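For concreteness, here’s a minimal sketch of that payoff rule in Python. The class and helper names are just my illustration, not the actual prototype code; the only parts taken from above are the min() rule and the default of zero.

```python
from collections import defaultdict
from itertools import combinations


class Agent:
    def __init__(self, name):
        self.name = name
        # Sparse map of desires; anyone not listed defaults to 0.0.
        self._desires = defaultdict(float)

    def set_desire(self, other, strength):
        self._desires[other.name] = strength

    def desire_to_be_near(self, other):
        return self._desires[other.name]  # missing entries read as 0.0


def pair_payoff(a, b):
    """Reciprocated-only payoff: a one-sided desire contributes nothing."""
    return min(a.desire_to_be_near(b), b.desire_to_be_near(a))


def aggregate_utility(agents, are_adjacent):
    """Sum pairwise payoffs over every pair the layout places adjacent.

    `are_adjacent` is a stand-in for however the layout decides adjacency.
    """
    return sum(
        pair_payoff(a, b)
        for a, b in combinations(agents, 2)
        if are_adjacent(a, b)
    )


# The fanbase example: the fan adores the author, the author is
# indifferent (default 0), so placing them adjacent is worth min(10, 0) = 0.
author, fan = Agent("author"), Agent("fan")
fan.set_desire(author, 10.0)
assert pair_payoff(fan, author) == 0.0
```

Taking the min rather than, say, the sum is what makes an army of one-sided admirers count for nothing: the pair’s payoff is capped by whichever party wants the adjacency less.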
P.S. Does the fact that each user desire expression (roughly, the individual utility function) gets evaluated in a complex way that depends on how it relates to the other desire expressions make this not utilitarianism? Does this position that fitting our desires together will be more complex than mere addition have a name?
https://www.fastcompany.com/90107856/urban-poverty-has-a-sound-and-its-loud
Feedback loop: poverty and noise each contribute to the other.