There have been a lot of clever PR stunts in history.
Most of them have not targeted smart, educated nonconformists. Eliezer successfully changed people's minds by installing a way of thinking — a framework of heuristics, concepts, and ideas — that is fine-tuned to culminate, non-obviously, in one inevitable conclusion: that you want to contribute money to his charity because it is rational to do so.
Take a look at the Sequences in the light of the Singularity Institute. Even the Quantum Physics Sequence helps drive home a point that is indispensable for convincing people, who would otherwise be skeptical, that it is rational to take risks from AI seriously. The Sequences teach that logical implications of general beliefs you already hold do not cost you extra probability, and that it would be logically rude to demand some knowably unobtainable evidence.
A true masterpiece.