I wonder what you make of the fact that all comments by you are upvoted above what might be expected from their content, and that critique of you often gets downvoted.
Do you endorse being the cherished charismatic leader? Aligning followers surely is a winning strategy.
(There are few things more likely to predispose people to downvote future comments of yours than this kind of snarky whining. I am unsure what your goal on LessWrong is, but it is unlikely that this tactic is an optimal way to achieve it. This is intended as a helpful explanation.)
I thought a bit about this. I see now that, independent of the truth of my observation, one shouldn't make such comments. I guess the question can't be answered anyway, due to the power structure it involves, if I get this right from
http://lesswrong.com/lw/5wn/the_48_rules_of_power_viable/
The fact that EY is upvoted more than expected is not as pronounced as I thought, after sampling a few more of his comments. And even if it were, it would be of no consequence for the group or for him in particular.
I'm sorry to have disturbed you with this useless question.
I thought a bit about this. I see now that, independent of the truth of my observation, one shouldn't make such comments.
Good insight. There are exceptions, but such endeavours are difficult social-political moves, made even more difficult if the context suggests the reply is a retaliation. To create influence against the behaviour or reception of a high-status figure, it is necessary to choose an ideal time and an unambiguous breach, then apply all the tact you can muster.
The fact that EY is upvoted more than expected is not as pronounced as I thought, after sampling a few more of his comments. And even if it were, it would be of no consequence for the group or for him in particular.
Sometimes the Eliezer votes are also more polarized: he gets downvoted heavily as well as upvoted heavily. Partly this is because people pay more attention to him, but also because he really does produce quality contributions as well as tactless social blunders. On a related note, I have noticed that many of my top-voted comments are either criticisms of something Eliezer has done or defences of Eliezer against unjustified criticisms from others. Again I see polarization. Everything matters more when in the spotlight.
To create influence against the behaviour or reception of a high-status figure, it is necessary to choose an ideal time and an unambiguous breach, then apply all the tact you can muster.
I understand the concept of status. I also understand the relation of status and power, especially in hierarchies. The problem is that I do not accept status as a moral model of interactions among mature adults. It just clashes with my values. This hasn't been a problem most of the time, probably because I mostly wasn't seen by others as lower status. Sooner or later this had to happen. I will adapt.
Everything matters more when in the spotlight.
I see this too. I don't consider it unfair that he gets more karma. Well, what is karma anyway? I really just wondered and wanted to ask what he thinks of it. I shouldn't have asked it that way. It was my status blindness.
I wonder what this means in the context of FAI and what EY wrote on the coherent extrapolated volition of humanity. That might be read to imply that the values of all humans count equally. But social hierarchies, and http://lesswrong.com/lw/hv6/gains_from_trade_slug_versus_galaxy_how_much/ among others, imply that the values of high-status individuals may count more.
I wonder what this means in the context of FAI and what EY wrote on coherent extrapolated volition of humanity. That might be read to imply that the values of all humans count equally.
That's an interesting question, and one where I am wary of EY's expressed beliefs on the subject. Eliezer has only ever talked about CEV as if it is something that applies to "humanity", without specifying the aggregation mechanism, but with the implication that once extrapolation is applied the humans would in general have acceptable values. But this opens up enormous scope for the "What if most people are Dicks?" problem, as well as the perverse incentive that people who breed more now can expect CEV to be more in favour of them. I'm not entirely confident that CEV wouldn't result in a genocide or two, and at best there are going to be some milder dystopic outcomes.
Optimising for CEV<X> may sound egalitarian but on close inspection it is inherently less egalitarian than optimising for CEV<Y>. (This example should not be interpreted as advocacy of creating an FAI with that value system.)
It could. But it needn't. Just as we are not obliged to maximise our inclusive genetic fitness merely because those were the incentives of our genes as we were evolving, we need not implement a CEV based on social status just because current human influence follows that pattern. There are two parts to this consideration: it applies both to determining the aggregation mechanism used by the CEV procedure itself, and also, for a given aggregation procedure and a given target group, to the value system CEV computes, which could itself be based somewhat on status. This again makes us look at which group CEV is applied to. CEV<nerds> will likely emphasise status to a different degree than CEV<humanity>, and so, to the extent that those are different, a group of nerds can be expected to be wary of CEV<humanity>.
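A toy numerical sketch of the aggregation-mechanism point above (entirely my own construction, assuming nothing about how CEV would actually be specified; the value vectors and weights are made up): the same per-person values yield very different aggregates under equal weighting versus status weighting.

```python
def aggregate(value_vectors, weights):
    """Weighted average of equal-length per-person value vectors."""
    total = sum(weights)
    dims = len(value_vectors[0])
    return [
        sum(w * vec[d] for w, vec in zip(weights, value_vectors)) / total
        for d in range(dims)
    ]

# Three hypothetical people's values over two made-up dimensions,
# say (egalitarianism, status-seeking):
values = [(0.9, 0.1), (0.8, 0.2), (0.1, 0.9)]

# One person, one vote:
equal = aggregate(values, [1, 1, 1])        # roughly [0.60, 0.40]

# The third (high-status) person's values weighted tenfold:
by_status = aggregate(values, [1, 1, 10])   # roughly [0.23, 0.78]

print(equal, by_status)
```

The point mirrors the paragraph above: which aggregation mechanism a CEV-like procedure uses, and over which group, is itself a value-laden choice the "humanity" framing leaves open.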
For the most part, yes.