Somebody mentioned Aleister Crowley’s quotes on LW a little while ago; so:
There seems to be much misunderstanding about True Will … The fact of a person being a gentleman is as much an ineluctable factor as any possible spiritual experience; in fact, it is possible, even probable, that a man may be misled by the enthusiasm of an illumination, and if he should find apparent conflict between his spiritual duty and his duty to honour, it is almost sure evidence that a trap is being laid for him and he should unhesitatingly stick to the course which ordinary decency indicates … I wish to say definitely, once and for all, that people who do not understand and accept this position have utterly failed to grasp the fundamental principles of the Law of Thelema.
-- Magical Diaries of Aleister Crowley: Tunisia 1923 (1996), edited by Stephen Skinner, p. 21
If one is skeptical of the existence of Thelema or of the validity of these spiritual experiences, then this sounds a lot like religious leaders who say “Sure, believe in Heaven. But don’t commit suicide to get there faster. Or commit homicide to get other people there faster. Or do anything else that contradicts ordinary decency.”
Part of the fun of being right is that when your system contradicts ordinary decency, you get to at least consider siding with your system.
(Although hopefully, if your system is right, you will choose not to, for the right reasons.)
My Crowley background is pretty spotty, but I read that as him generalizing over the intersection of ethics and religious experience, then specializing to his own faith. It’s not entirely unlike some posts I’ve read here, in fact; the implication seems to be that if some consequence of your religious (i.e. axiomatic; we could substitute decision-theoretic or similarly fundamental) ethics appears to suggest gross violations of common ethics, then it’s more likely that you’ve got the wrong axioms or forgot to carry the one somewhere than that you actually need to run out and (e.g.) destroy all humans. Which is very much what I’d expect from a rationalist analysis of the topic.
Here is an intuition pump: you see a baby who has gotten hold of his dad’s suitcase nuke and is about to destroy the city. Do you prevent him from pushing the button, even by lethal means? If the answer is yes, then consider Richard’s original question, and check whether the differences between the two situations are enough to reverse your decision.
On the one hand, yes; on the other hand, I do think I take the risks from UFAI seriously, and have some relevant experience and skill, but still wouldn’t participate in a paramilitary operation against a GAI researcher.
edit: On reflection, this is due to my confidence in my ability to correctly predict the end of the world, and the problem of multiplying low probabilities by large utilities.
You mean lack of confidence, right?
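As an aside, here is a minimal sketch of the “multiplying low probabilities by large utilities” problem mentioned above. All the numbers are invented purely for illustration; nobody in the thread has endorsed these estimates.

```python
# Toy expected-utility calculation illustrating the "low probability
# times large utility" problem. All numbers are made up for illustration.

def expected_utility(outcomes):
    """Sum of probability * utility over (probability, utility) pairs."""
    return sum(p * u for p, u in outcomes)

# Option A: do nothing. Assume a tiny chance of a catastrophic outcome.
do_nothing = [
    (1e-9, -1e15),    # one-in-a-billion chance of losing 10^15 utils
    (1 - 1e-9, 0.0),  # otherwise, nothing happens
]

# Option B: a drastic, ordinarily-indecent action with a certain,
# large-but-bounded cost.
drastic_action = [(1.0, -1e5)]

print(expected_utility(do_nothing))      # -1000000.0
print(expected_utility(drastic_action))  # -100000.0
```

The naive sum favors the drastic action by a factor of ten, yet that conclusion is entirely hostage to the 1e-9 probability estimate, which nobody can actually pin down to within several orders of magnitude. That is the worry being gestured at: low confidence in one’s ability to predict the end of the world means low confidence in exactly the number that dominates the calculation.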