Perhaps you weren’t aware, but Eliezer has stated that it’s rational to not respond to threats of blackmail.
I don’t think he was talking about human beings there. Obviously you don’t want a reputation for being susceptible to blackmail, but IMHO, maximising expected utility results in a strategy that is not as simple as never responding to blackmail threats.
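To make that concrete, here’s a toy expected-utility sketch (the variables are mine and purely illustrative, not something Eliezer wrote): let $c$ be the cost of complying, $t$ the probability the blackmailer actually carries out the threat if refused, $h$ the harm if they do, and $R$ the expected future benefit of a reputation for refusing (fewer subsequent blackmail attempts). Then

$$EU(\text{comply}) = -c, \qquad EU(\text{refuse}) = -t\,h + R,$$

so refusing maximises expected utility iff $t\,h - R < c$. Since that inequality can go either way depending on how credible the threat is and how much the reputation effect is worth, the utility-maximising policy for a human isn’t a blanket “never respond.”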
I think this is correct. Eliezer has drawn on The Strategy of Conflict before, which goes into mathematical detail about the tradeoffs of precommitment against inconsistently rational players. The “no blackmail” thing was in regard to a rational UFAI.