Point is, it’s not a strategy for arriving at truths; it’s a snappy comeback to a failure mode I’m getting really tired of. The fact that something is in the realm of speculative fiction is not a valid argument in a world full of cyborgs, tablet computers, self-driving cars, and causality-defying decision theories. And yes, basilisks.
The argument isn’t that because something is found in speculative fiction it can’t be real; it’s that this thing you’re talking about isn’t found outside of speculative fiction—i.e. it’s not real. Science can’t do that yet. If you’re familiar with the state of a science, you have a good sense of what is and isn’t possible yet. “A basilisk kill agent that allows him to with a few clicks untraceably assassinate any person he can get to read a short email or equivalent, with comparable efficiency to what is shown in Death Note” is very likely one of those things. I mention “speculative fiction” because a lot of people have a tendency to privilege hypotheses they find in such fiction.
Hypnotism is not the same as what you’re talking about. The Roko ‘basilisk’ is a joke compared to what you’re describing. None of these are anecdotal evidence for the power you are describing.
Oh, the illusion of transparency. Yeah, that’s at least a real argument.
There are plenty of things that individual geniuses can do that the institutions you seem to be referring to as “science” can’t yet mass-produce, especially in the reference class of things like works of fiction or political speeches, which many basilisks belong to. “Science” also believes rational agents defect on the prisoner’s dilemma.
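(A minimal sketch, assuming the standard illustrative payoff numbers rather than anything from this thread, of the textbook result I mean: whatever the other player does, defecting pays more, so classical game theory calls defection the rational play in the one-shot game.)

```python
# One-shot prisoner's dilemma with the usual illustrative payoffs
# (temptation 5 > reward 3 > punishment 1 > sucker 0). These numbers
# are assumptions for illustration, not taken from the discussion.

# Payoff to "me" for (my_move, their_move); "C" = cooperate, "D" = defect.
PAYOFF = {
    ("C", "C"): 3,
    ("C", "D"): 0,
    ("D", "C"): 5,
    ("D", "D"): 1,
}

for their_move in ("C", "D"):
    # Pick the move that maximizes my payoff against this fixed opponent move.
    best_reply = max(("C", "D"), key=lambda my_move: PAYOFF[(my_move, their_move)])
    print(f"If they play {their_move}, my best reply is {best_reply}")
# Prints "D" both times: defection strictly dominates, which is the
# textbook conclusion being gestured at above.
```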
Also, while proposing something like deliberate, successful government suppression would clearly be falling into the conspiracy-theory failure mode, it nonetheless does seem that an extremely dangerous weapon, one that sounds absurd when described, works through badly understood psychology present only in humans, and is likely to be discovered only by an empathic, extremely high elite of intellectuals, would be less likely to become public knowledge as quickly as most things.
And I kept to small-scale, not-very-dangerous pseudo-basilisks on purpose, just in case someone decides to look them up. They are more relevant than you think, though.
I don’t believe you. Look, obviously if you have secret knowledge of the existence of fatal basilisks that you’re unwilling to share, that’s a good reason to have a higher credence than mine. But I asked you for evidence (not even good evidence, just anecdotal evidence) and you gave me hypnotism and the silly Roko thing. Hinting that you have some deep understanding of basilisks that I don’t is explained far better by the hypothesis that you’re trying to cover for the fact that you made an embarrassingly ridiculous claim than by your actually having such an understanding. It’s okay, it was the Irrationality Game. You can admit you were privileging the hypothesis.
“Science” also believes rational agents defect on the prisoner’s dilemma.
Again, pointing to a failure of science as a justification for ignoring it when evaluating the probability of a hypothesis is a really bad thing to do. You actually have to learn things about the world in order to manipulate the world. The most talented writers in the world are capable of producing profound and significant—but nearly always temporary—emotional reactions in the small set of people that connect with them. Equating that with
A basilisk kill agent that allows him to with a few clicks untraceably assassinate any person he can get to read a short email or equivalent, with comparable efficiency to what is shown in Death Note
is bizarre.
Also, while proposing something like deliberate, successful government suppression would clearly be falling into the conspiracy-theory failure mode, it nonetheless does seem that an extremely dangerous weapon, one that sounds absurd when described, works through badly understood psychology present only in humans, and is likely to be discovered only by an empathic, extremely high elite of intellectuals, would be less likely to become public knowledge as quickly as most things.
A government possessing a basilisk and keeping it a secret is several orders of magnitude more likely than what you proposed. Governments have the funds and the will to both test and create weapons that kill. Also, “empathic” doesn’t seem like a word that describes Eliezer well.
Anyway, I don’t really think this conversation is doing anyone any good, since debating absurd possibilities has the tendency to make them seem even more likely over time: you’ll keep running your sense-making system and coming up with new and better justifications for the claim until you actually begin to think “wait, two percent seems kind of low!”
Yeah, this thread is getting WAY too adversarial for my taste, dangerously so. At least we can agree on that.
Anyway, you did admit that sometimes, rarely, a really good writer can produce permanent, profound emotional reactions, and I suspect most of the disagreement here actually resides in the lethality of emotional reactions, and in my taste for wording things to sound dramatic as long as they are still true.
Well, I should point out that if you sincerely believe your knowledge could kill someone if it got out, you likely won’t test this belief directly. You may miss all sorts of lesser opportunities for updating. We don’t have to think you’re Stupid or you Fail As A Rationalist in order to think you got this one wrong.
It’s sort of the same situation as posting on a public forum any other kind of information about how to construct weapons that may or may not work. It’s not all that likely to actually give someone access to a weapon that deadly, but it’s bad form just for the possibility, and because they may still hurt themselves or others in a failed attempt.
I was also trying to scare people away from the whole thing, but after further consideration it probably wasn’t very effective anyway.