It seems pretty obvious that Eliezer’s view is that FAI is the quick ticket to world domination (in the sense of world states that you talk about), and he seems to be structuring his conspiracy accordingly.
It is demonstrable that one’s strength as a rationalist correlates directly with the probability that one will make correct decisions.
Really? How would one demonstrate this? What does it mean for a definition to be “correct”? If something is true by definition, is it really demonstrable?
we have a moral obligation to work our hardest on this project
Really? Your plan is to get people interested in world domination by guilting them?
It seems pretty obvious that Eliezer’s view is that FAI is the quick ticket to world domination...
I hadn’t considered that, but now I see it clearly. How interesting.
Really? Your plan is to get people interested in world domination by guilting them?
Ha! If that would work, maybe it’d be a good idea. But no, pointing out a moral obligation is not the same as guilting. Guilting would be me messaging you, saying “See that poor starving African woman? If you had listened to my plan, she’d be happier.” But I won’t be doing that.