I find it hard to relate to the way of thinking of someone who’s willing to increase the chances that humanity goes extinct if someone deletes his post from a forum on the internet.
Please go find another community to “help” with this kind of blackmail.
If I understand him correctly, what he’s trying to do is to precommit to doing something which increases existential risk (ER) iff EY does something that he (wfg) believes will increase ER by a greater amount. Now he may or may not be correct in that belief, but it seems clear that his motivation is to decrease net ER by disincentivizing something he views as increasing ER.
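To make that deterrence logic concrete, here is a toy expected-value sketch. This is my own framing, not wfg’s; the symbols $\Delta$, $\delta$, and $p$ are assumptions for illustration, and it assumes the threat either fully deters or fully fails (with the threat carried out in the latter case). Let $\Delta$ be the ER increase from EY’s action, $\delta$ the ER increase from wfg’s threatened response, and $p$ the probability that the threat deters EY:

$$
\mathbb{E}[\text{ER} \mid \text{no threat}] = \Delta,
\qquad
\mathbb{E}[\text{ER} \mid \text{threat}] = (1-p)(\Delta + \delta).
$$

On this toy model, the precommitment lowers net expected ER exactly when $(1-p)(\Delta+\delta) < \Delta$, i.e. when $p > \delta/(\Delta+\delta)$. So if the threatened increase $\delta$ is small relative to the increase $\Delta$ being deterred, even a modest deterrence probability would make the precommitment net-negative for ER by wfg’s own lights.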
Right. Thanks for this post. People keep responding with knee-jerk reactions to the implementation rather than thought-out ones to the idea :-/
Not that I can blame them; this seems to be an emotional topic for all of us.
Fair enough; go check out this article (and the Wikipedia article on MAD, mutually assured destruction) and see if it doesn’t make a bit more sense.