I violently agree with all of this. Have you seen any basilisk-like ideas besides Roko’s? Roko’s at least looks like a real basilisk until you think about it. Everything else I’ve seen doesn’t come close to living up to the name.
‘Violent agreement’ seems to have been adopted for use in situations in which the participants have been arguing aggressively only to discover that they agree on the substantive issues. For a term that hasn’t been hijacked as jargon, I go with “vehemently”. It has a more visceral feel to it too. :)
> Have you seen any basilisk-like ideas besides Roko’s? Roko’s at least looks like a real basilisk until you think about it. Everything else I’ve seen doesn’t come close to living up to the name.
Roko’s is the most interesting I’ve seen too. Although for some people a combination of Pascal’s Wager and certain religious doctrines about children not being held accountable for their beliefs until a certain age would do it. Once again it is the ability to apply abstract reasoning, combined with the naivety and an inability to take that reasoning all the way to a sane conclusion, that would cause the problem.
Anyone have a really solid working definition for ‘basilisk’ as we use it here?
> Although for some people a combination of Pascal’s Wager and certain religious doctrines about children not being held accountable for their beliefs until a certain age would do it.
Am I supposed to be able to see it from just this? Assuming it’s not the kind of thing that would hurt LW posters can you explain? Otherwise, pm it?
One interesting idea is that it seems plausible to create basilisks that only affect your memetically/cognitively different enemies: perhaps the only way to avoid the harm of the basilisk is to deconvert from your religion/ideology. A basilisk that only worked on, say, religious fundamentalists would be a really powerful weapon (I’m not suggesting that the basilisk be capable of killing anyone, necessarily).
> Am I supposed to be able to see it from just this? Assuming it’s not the kind of thing that would hurt LW posters can you explain?
1. Pascal’s Wager → accept popular religion. (Disclaimer: I did label these people naive, with an inability to take reasoning all the way to a sane conclusion. Nevertheless, it works on some intelligent people better than on some unintelligent people.)
2. There exist popular religious doctrines holding that God will send young children to heaven regardless, because they are too young to have been able to do the conversion thing. I think the “Age Of Accountability” concept may be related.
3. If a child is not likely to convert to the ‘True’ religion in adulthood then they are (believed to be) likely to go to Hell instead of Heaven if they grow up.
4. Such a child would go to Heaven if murdered while young but Hell if they grow up.
5. Such a child would be better off if they are murdered.
6. Therefore...
A quick google on “do young children go to hell?” led me to this, excerpt:
> Therefore, we have been given a specific example in the Old Testament of an infant who died and would live forever in heaven. And Jesus Christ Himself, in the New Testament, stated that little children retain the qualities that make a person eligible to inherit the kingdom of God. We see, then, that infants and small children that die are in a safe state, and will live eternally in heaven.
>
> With such clear statements from the Bible about the eternal destiny of dead infants and small children, why have religious people mistakenly taught that babies go to hell when they die?
(The scripture quotes in question were totally reaching by the way. But that’s the whole point of theology.)
> Knowledge or concepts, the comprehension of which will cause one significant disutility.

That is over-broad, unfortunately. The concept needs to be distinguished from “ugly truths”, where the disutility comes from an emotional reaction to how far from ideal the world is.
> Knowledge or concepts, the comprehension of which may cause substantial damage to one’s instrumental rationality.

Do we include concepts selected to elicit a maximal emotional response? Such as particularly Funny Jokes, or things so sad that they would drive most people to suicide? They do seem to be a different concept but still deserve a similar title. (If not basilisk then at least cockatrice, or one of the Gorgons.)
> Am I supposed to be able to see it from just this? Assuming it’s not the kind of thing that would hurt LW posters can you explain? Otherwise, pm it?
I think I know what wedrifid is getting at, but I don’t think Pascal’s Wager would do it. Pascal’s Wager argues that one should act as if one believes in God because the costs are low and the potential benefits (Heaven) are high.
But in order to get to the particular failure state at which I think wedrifid is hinting, you can’t just be betting on God—you have to be absolutely certain that Heaven exists and that its joys outweigh on every axis everything that Earth has to offer. Most people, no matter what they say, are not that certain, which is why we don’t routinely slaughter infants in order to ensure their blameless souls’ entry into Heaven. (Similar logic has been invoked to rationalize murders—such as innocent deaths at witch trials—but in these cases, as a justification pasted on after the fact rather than an honest motive for murder.)
> But in order to get to the particular failure state at which I think wedrifid is hinting, you can’t just be betting on God—you have to be absolutely certain that Heaven exists

No, absolute certainty is definitely not required. The cost is increased, so a proportionate increase in probability × payoff is required. But this is still all dwarfed by the arbitrarily large payoffs inherent in religious questions. The whole point of ‘afterlife’-focused doctrine is to encourage the flock to discount all ‘earthly’ matters as trivial compared to eternal questions.
> Most people, no matter what they say, are not that certain, which is why we don’t routinely slaughter infants in order to ensure their blameless souls’ entry into Heaven.

No, that would not be a rational reason to refrain from the slaughter. The difference between 90% sure and absolutely certain isn’t much of a big deal when you have the chance of flipping the sign bit of an arbitrarily large disutility payoff (Hell). A 0.05% hunch would be more than enough.

Rational agents that really do have arbitrarily large utility payoffs floating around in their utility function will inevitably do things that look insane to us.
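The arithmetic behind this point can be sketched with a toy expected-utility calculation (all numbers and names here are illustrative assumptions, not anything from the thread or any real utility function):

```python
# Toy expected-utility sketch: once one payoff term is allowed to be
# arbitrarily large, the difference between 90% certainty and a 0.05%
# hunch stops mattering to the decision. All numbers are illustrative.

def expected_utility(p_true, payoff_if_true, earthly_cost):
    """Probability-weighted payoff of acting on a doctrine, minus its cost."""
    return p_true * payoff_if_true - earthly_cost

HUGE = 1e18  # stand-in for an "arbitrarily large" afterlife payoff
COST = 1e6   # a very large, but finite, earthly cost of acting on it

for p in (0.9, 0.0005):
    print(f"p={p}: EU={expected_utility(p, HUGE, COST):.3g}")

# Both come out enormously positive. The sign of the decision flips
# only when p * HUGE < COST, i.e. p < 1e-12 here -- far smaller than
# any probability a person would describe as a "hunch".
```

The design point is simply that the cost term is fixed while the payoff term is unbounded, so no realistic level of doubt rescues the agent.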
> No, absolute certainty is definitely not required. The cost is increased, so a proportionate increase in probability × payoff is required.

Right, but now you’ve left the standard formulation of Pascal’s Wager. The original Pascal’s Wager includes the stipulation that one loses nothing by behaving as if God were real. To get to a point where you’re willing to kill kids, obviously you have to go a lot further—you must be ready to incur substantial costs as a consequence of belief.
> One interesting idea is that it seems plausible to create basilisks that only affect your memetically/cognitively different enemies

I am sure that this is possible, but wonder why it has not been done yet—or at least appeared on my radar. It might be one of the darker arts, and a very interesting one!
Creating basilisks is hard, as evidenced by the fact that we have no recorded instance of one ever existing.
For similar reasons, good bodyguards—the kind that would take a bullet for you—are hard to find. (apologies to Magic: The Gathering)
Or: the only basilisks that are easy to create are those that directly disable the host’s ability to spread them.
Is it time to link to Monty Python yet?