would that information significantly affect your current confidence levels about what you believe?
Yes. In the absence of actual evidence (which seems dangerous to gather in the case of this basilisk), I pretty much have to go by expressed opinions. To my mind, it was like trying to count the results of experiments that haven’t been performed yet.
I did not seek out more information because it was a throwaway line in an argument attempting to explain to people why it appears their voices are being ignored. I personally am on the side of censoring the idea, not having understood it at all when it was first posted, and that may have bled into my posts (I should have exercised stronger control over that), but I am not arguing for censorship. I am explaining why, when someone says “it’s not dangerous!”, some people aren’t coming around to their perspective.
I don’t intend to argue for the censorship of the idea unless sorely pressed.
A few things:
** I’m confused. On the one hand, you say knowing the popularity of various positions is important to you in deciding your own beliefs about something potentially dangerous to you and others. On the other hand, you say it’s not worth seeking more information about and was just a throwaway line in an argument. I am having a hard time reconciling those two claims… you seem to be trying to have it both ways. I suspect I’ve misunderstood something important.
** I didn’t think you were arguing for censorship. Or against it. Actually, I have long since lost track of what most participants in this thread are arguing for, and in some cases I’m not sure they themselves know.
** I agree with you that the existence of knowledgeable people who think something is dangerous is evidence that it’s dangerous.
** Since it seems to matter: for my own part, I rate the expected dangerousness of “the basilisk” very low, and the social cost to the group of the dispute over “censoring” it significantly higher but still low.
** I cannot see why that should be of any evidentiary value whatsoever, to you or anyone else. Whether I’m right or wrong, my position is a pretty easy-to-reach one; it’s the one you arrive at in the absence of other salient beliefs (like, for example, the belief that EY/SIAI is a highly reliable estimator of potential harm done by “basilisks” in general, or the belief that the specific argument for the harmfulness of this basilisk is compelling). And most newcomers will lack those other beliefs. So I expect that quite a few people share my position (far more than 50%), but I can’t see why you ought to find that fact compelling. That a belief is very widely shared among many, many people like me who don’t know much about the topic isn’t much evidence for anything.
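(A minimal numerical sketch of that last point, using toy numbers and a naive independence assumption that are mine, not anything stated above: if fifty people hold the easy default position for the same reason, namely absence of information, then treating their opinions as fifty independent pieces of evidence badly overcounts them; they count roughly once.)

```python
import math

def posterior_log_odds(prior_log_odds, likelihood_ratios):
    """Naive aggregation: combine a prior with per-opinion likelihood ratios,
    treating every opinion as an independent piece of evidence."""
    return prior_log_odds + sum(math.log(lr) for lr in likelihood_ratios)

# Toy numbers (illustrative assumptions only):
# an informed reader's "dangerous" verdict counts as likelihood ratio 3,
# an uninformed default "not dangerous" opinion as likelihood ratio 0.95.
informed = [3.0, 3.0]        # two people who actually evaluated the argument
uninformed = [0.95] * 50     # fifty people holding the easy default position

# Counting fifty correlated default opinions as independent swamps the
# informed verdicts; collapsing them into one shared observation does not.
print(posterior_log_odds(0.0, informed + uninformed))  # ~ -0.37 (naive)
print(posterior_log_odds(0.0, informed + [0.95]))      # ~ +2.15 (collapsed)
```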
Sometimes that isn’t a bad state to be in. Not having an agenda to serve frees up the mind somewhat! :)
(nods) I’m a great believer in it. Especially in cases where a disagreement has picked up momentum, and recognizable factions have started forming… for example, if people start suggesting that those who side with the other team should leave the group. My confidence in my ability to evaluate an argument honestly goes up when I genuinely don’t know what team that argument is playing for.
I suspect I’ve obfuscated it, actually. The popularity of various positions is not intrinsically important to me—in fact, I give professions of belief about as little credit as I can get away with. This specific case is one where every form of evidence I find stronger (reasoning through the argument logically for flaws; statistical evidence about its danger) is unavailable. With a dearth of stronger evidence, I have to rely on weak evidence—but “the evidence is weak” is not an argument for privileging my own unsubstantiated position.
I don’t usually feel the need to collect weak evidence… but I should have, in this case. I was following a heuristic of not collecting weak evidence (a waste of effort) without noticing that no stronger evidence was available here.
Why are people’s beliefs of any value? Everyone has the ability to reason. All (imperfect) reasoners fail in one way or another; if I look at many of them (controlling for biased reasoning), it gives me a better chance of spotting the biases—I have a control to compare against.
This case is a special case: some people do have evidence. They’ve read the basilisk, applied their reasoning and logic, and concluded that it is (or is not) dangerous. Those people’s beliefs should be privileged over the beliefs of people who have not read it. I can’t access private signals like that—I don’t want to read a potential basilisk. So I make a guess at how strong their private signal is (this is why I care about their rationality) and use that as weak evidence for or against.
If seeking harder evidence weren’t dangerous (and it usually isn’t), I would have done that instead.
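(For concreteness, a minimal sketch of the kind of weighting described above: guess a reliability for each person’s private signal and fold their stated verdict into a prior as weak evidence. The reliabilities, the prior, and the conditional-independence assumption are all illustrative choices, not anything asserted in the thread.)

```python
import math

def update_on_opinions(prior_prob, opinions):
    """Fold (verdict, estimated_reliability) pairs into a prior probability.

    verdict: True if the person judges the idea dangerous, False otherwise.
    estimated_reliability: a guessed probability in (0.5, 1.0) that the
    person's verdict matches the truth -- the strength of their private
    signal. Opinions are treated as conditionally independent, which is
    itself a strong simplifying assumption.
    """
    log_odds = math.log(prior_prob / (1.0 - prior_prob))
    for dangerous, reliability in opinions:
        log_lr = math.log(reliability / (1.0 - reliability))
        log_odds += log_lr if dangerous else -log_lr
    return 1.0 / (1.0 + math.exp(-log_odds))

# Illustrative inputs: two readers guessed to be fairly reliable who call it
# dangerous, one guessed to be barely better than chance who calls it safe.
opinions = [(True, 0.7), (True, 0.65), (False, 0.55)]
print(update_on_opinions(prior_prob=0.1, opinions=opinions))  # ~0.28
```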