To exert influence against the behaviour, or the reception, of a high-status figure, it is necessary to choose an ideal time and an unambiguous breach, and then to use all the tact you can muster.
I understand the concept of status. I also understand the relation between status and power, especially in hierarchies. The problem is that I do not accept status as a moral model of interactions among mature adults. It just clashes with my values. This hasn’t been a problem most of the time, probably because I mostly wasn’t seen by others as lower status. Sooner or later this had to happen. I will adapt.
Everything matters more when in the spotlight.
I see this too. I don’t consider it unfair that he gets more karma. Well, what is karma anyway? I really just wondered and wanted to ask what he thinks of it. I shouldn’t have asked it that way. It was my status blindness.
I wonder what this means in the context of FAI and what EY wrote on coherent extrapolated volition of humanity. That might be read to imply that the values of all humans count equally. But social hierarchies and http://lesswrong.com/lw/hv6/gains_from_trade_slug_versus_galaxy_how_much/ among others imply that the values of high-status individuals may count more.
I wonder what this means in the context of FAI and what EY wrote on coherent extrapolated volition of humanity. That might be read to imply that the values of all humans count equally.
That’s an interesting question, and one where I am wary of EY’s expressed beliefs on the subject. Eliezer has only ever talked about CEV as if it is something that applies to “humanity”, without specifying the aggregation mechanism but with the implication that once Extrapolation is applied the humans would in general have acceptable values. But this opens up enormous scope for the “What if most people are Dicks?” problem, as well as the perverse incentive that people who breed more now can expect CEV to be more in favour of them. I’m not entirely confident that CEV wouldn’t result in a genocide or two, and at best there are going to be merely mild dystopic outcomes.
Optimising for CEV<…> may sound egalitarian but on close inspection it is inherently less egalitarian than optimising for CEV<…>. (This example should not be interpreted as advocacy of creating an FAI with that value system.)
It could. But it needn’t. Just as we are not obliged to maximise our inclusive genetic fitness merely because those were the incentives of our genes as we evolved, we need not implement a CEV based on social status merely because human influence currently follows that pattern. There are two parts to this consideration: it applies both to determining the aggregation mechanism used by the CEV procedure itself, and also to the fact that, for a given aggregation procedure and a given target group, the resulting CEV may compute a value system based somewhat on status. This again makes us look at which group CEV is applied to. CEV<…> will likely emphasise status to a different degree than CEV<…>, and so to the extent that those are different, a group of nerds can be expected to be wary of CEV<…>.
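To make the aggregation-mechanism point concrete, here is a minimal sketch in Python. Everything in it is invented for illustration (the names, the outcomes, the weights); it is not drawn from anything in the CEV literature. It just contrasts an egalitarian aggregation, where every person’s values count equally, with a status-weighted aggregation, where a high-status individual counts for more:

```python
# Toy contrast between two aggregation mechanisms. All names, outcomes and
# numbers below are invented for illustration only.

def aggregate(preferences, weights):
    """Return the weighted average preference score for each outcome."""
    outcomes = next(iter(preferences.values())).keys()
    total = sum(weights.values())
    return {
        outcome: sum(weights[person] * prefs[outcome]
                     for person, prefs in preferences.items()) / total
        for outcome in outcomes
    }

# Hypothetical preference scores (0 to 1) of three people over two outcomes.
preferences = {
    "alice": {"status_hierarchies": 0.9, "flat_society": 0.1},  # high status
    "bob":   {"status_hierarchies": 0.2, "flat_society": 0.8},
    "carol": {"status_hierarchies": 0.1, "flat_society": 0.9},
}

# Egalitarian aggregation: every person's values count equally.
equal_weights = {person: 1.0 for person in preferences}

# Status-weighted aggregation: the high-status person counts for more.
status_weights = {"alice": 6.0, "bob": 1.0, "carol": 1.0}

print(aggregate(preferences, equal_weights))   # flat_society scores higher
print(aggregate(preferences, status_weights))  # status_hierarchies scores higher
```

With these made-up numbers the equal weighting favours the flat-society outcome while the status weighting favours the hierarchy outcome. That is the point: the choice of aggregation mechanism, before any extrapolation happens, already determines whose values dominate.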