Extraordinary situations call for extraordinary decency

Somebody mentioned Aleister Crowley’s quotes on LW a little while ago; so:
There seems to be much misunderstanding about True Will … The fact of a person being a gentleman is as much an ineluctable factor as any possible spiritual experience; in fact, it is possible, even probable, that a man may be misled by the enthusiasm of an illumination, and if he should find apparent conflict between his spiritual duty and his duty to honour, it is almost sure evidence that a trap is being laid for him and he should unhesitatingly stick to the course which ordinary decency indicates … I wish to say definitely, once and for all, that people who do not understand and accept this position have utterly failed to grasp the fundamental principles of the Law of Thelema.
-- Magical Diaries of Aleister Crowley: Tunisia 1923 (1996), edited by Stephen Skinner, p. 21
If one is skeptical of the existence of Thelema or of the validity of these spiritual experiences, then this sounds a lot like religious leaders who say “Sure, believe in Heaven. But don’t commit suicide to get there faster. Or commit homicide to get other people there faster. Or do anything else that contradicts ordinary decency.”
Part of the fun of being right is that when your system contradicts ordinary decency, you get to at least consider siding with your system.
(although hopefully if your system is right you will choose not to, for the right reasons.)
My Crowley background is pretty spotty, but I read that as him generalizing over ethical intersections with religious experience and then specializing to his own faith. It’s not entirely unlike some posts I’ve read here, in fact; the implication seems to be that if some consequence of your religious (i.e. axiomatic; we could substitute decision-theoretic or similarly fundamental) ethics seems to suggest gross violations of common ethics, then it’s more likely that you’ve got the wrong axioms or forgot to carry the one somewhere than that you need to run out and (e.g.) destroy all humans. Which is very much what I’d expect from a rationalist analysis of the topic.
Here is an intuition pump: you see a baby who has got hold of his dad’s suitcase nuke and is about to destroy the city. Do you prevent him from pushing the button, even by lethal means? If the answer is yes, then consider Richard’s original question, and check whether the differences between the two situations are enough to reverse your decision.
On the one hand, yes; on the other hand, I do think I take the risks from UFAI seriously, and have some relevant experience and skill, but still wouldn’t participate in a paramilitary operation against a GAI researcher.
edit: On reflection, this is due to my confidence in my ability to correctly predict the end of the world, and the problem of multiplying low probabilities by large utilities.
You mean lack of confidence, right?
There is a problem that can occur when you are attempting to check all of your biases when contemplating a serious crime.
The risk is that, while checking your biases, you are exposing yourself to people who would then be able to help law enforcement turn you in for that serious crime. And you would presumably be aware that you can hardly let others capture you, because capture would mean leaving the rest of your world-saving plan unexecuted, all because you weren’t secretive enough.
This means that by checking all of your biases you are boosting the chance of the world being destroyed if it turns out you weren’t biased. And it’s easy to convince yourself that you can’t take that risk, so you can’t talk to other people about your plans.
But you can’t thoroughly check your biases by consulting yourself and no one else. It is entirely possible for you to be heavily deluding yourself, having gotten brain damage or gone insane.
So you’re left with the conflicting demands of “I need to talk with other people to verify this is accurate” and “I need to keep this a secret, so I can implement it if it is accurate.”
As a side question, does anyone else feel like this has a few points that are oddly similar to Pascal’s mugging?
As an example, they both seem to have that aspect of “But you simply MUST do this, because the consequences are simply too great not to do it, even after accounting for the probabilities.”
A catholic priest couldn’t turn you in, and a smart one probably knows a lot about some kinds of human biases.
That’s not true about the confidentiality of priests… a priest has the same legal obligation to turn in someone who is a danger to themselves or others as a therapist does.
Doubt it. The Code of Canon Law states:
Can. 983 §1. The sacramental seal is inviolable; therefore it is absolutely forbidden for a confessor to betray in any way a penitent in words or in any manner and for any reason.
Can. 1388 §1. A confessor who directly violates the sacramental seal incurs a latae sententiae excommunication reserved to the Apostolic See; one who does so only indirectly is to be punished according to the gravity of the delict.
If you are convinced that, barring any biases, your calculated course of action is the right one, you could talk to anyone you trusted to be similarly convinced by your arguments. Either they will point out your errors and convince you that you shouldn’t act, or they will not discover any errors and agree to help you with your plans.
If you really believe it, and have compensated for biases by all means available, and you are a good consequentialist, … fat man … 5 workers …
I hear SIAI was looking for people skilled in martial arts, lol.