Well, if someone literally said “I am joining a very cult-like group that I don’t consider to be a cult”, wouldn’t it be much more likely that they are in fact joining a cult than the baseline probability of such? (Which is very low: only a very small fraction of people are, at any moment, literally in the process of joining a cult.)
It’s that this ironic statement acknowledges that the group is very much like a cult, or is described as one, and that what they’re doing is very much like what a person joining a cult does, yet for some reason they don’t believe it to be a cult.
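A minimal worked version of the update being claimed here, with entirely invented numbers: suppose the base rate of joining a cult is 0.1%, and suppose cult-joiners are a hundred times more likely than non-joiners to produce a statement like this. Then

$$P(\text{cult}\mid\text{statement}) = \frac{0.05 \times 0.001}{0.05 \times 0.001 + 0.0005 \times 0.999} \approx 0.09,$$

roughly ninety times the base rate, yet still well short of “probably a cult”.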
In those words? Yes. You may note that those are different words than Alicorn’s, or any of mine.
ETA: Wow, got seriously ninjaed there. I’ll expand. It’s not the “I don’t consider this a cult” part of the message that’d make me update away from the surface meaning so much as the “...and I expect you to get the joke” part. That trades on information, even if you don’t know it, that the speaker expects you to know. The speaker believes not only that they’re not joining a cult but that it’s obvious they’re not, or at most clear after a moment’s thought; otherwise it wouldn’t be funny.
That trades on information, even if you don’t know it, that the speaker expects you to know. The speaker believes not only that they’re not joining a cult but that it’s obvious they’re not, or at most clear after a moment’s thought; otherwise it wouldn’t be funny.
Well, if the speaker got a job at Google or McDonald’s, it would be far more obvious that they’re not joining a doomsday cult… yet it seems to me that they wouldn’t then be joking, out of the blue, that it’s a doomsday cult. It’s when it is a probable doomsday cult that you try to argue it isn’t by hoping that others laugh along with you.
It’s when it is a probable doomsday cult that you try to argue it isn’t by hoping that others laugh along with you.
Not in my experience. If people are scared that they’re doing something potentially life-ruining like joining a cult—and my first college roommate did drop out to join an ashram, so I know whereof I speak—they don’t draw attention to it by joking about it. They argue, or they deflect, or they clam up.
I’d expect the number of people who joined doomsday cults and made jokes like Alicorn’s to be approximately zero.
If people are scared that they’re doing something potentially life-ruining
...
I’d expect the number of people who joined doomsday cults and made jokes like Alicorn’s to be approximately zero.
I would be very surprised if this were true. My experience mirrors what Jiro said: people tend to joke about things that scare them. Of course, some would clam up (keep in mind that a clammed-up individual may have joked about it before and the joke was not well received, or may be better able to evaluate the lack of humour in such jokes).
Okay, they joke about it. Just not the kind of joke that draws attention to the thing they’re worried about; it’d be too close to home, like making a dead baby joke at a funeral. Jokes minimizing or exaggerating the situation—a type of deflection—are more likely; Kool-Aid jokes wouldn’t be out of the question, for example.
Well, presumably one who’s joining a doomsday cult is most worried about the doomsday (and would be relieved if it was just a bullshit doomsday cult). So wouldn’t that be a case of jokes minimizing the situation as it exists in the speaker’s mind? The reason that NORAD joke of yours is funny to either of us is that we both believe nuclear weapons can actually cause an extreme catastrophe, which is uncomfortable for us. Why wouldn’t a similar joke referencing a false doomsday be funny to one who holds that false belief as strongly as we believe in nuclear weapons?
Well, if someone ironically says that they are “dropping out of school to join a doomsday cult” (and they are actually dropping out of school to join something), they’ve got to be joining something that has something to do with a doomsday, rather than, say, another school, or a normal job, or the like.
There’s a lot of doomsdays out there. My first assumption, if I was talking to someone outside core rationalist demographics, would probably be climate change advocacy or something along those lines—though I’d probably find it funnier if they were joining NORAD.
Well, you start with a set containing Google, McDonald’s, and all the other organizations one could be joining, inclusive of all doomsday cults, and then you end up with a much smaller set of organizations that still includes all the doomsday cults. That ought to boost the probability of them joining an actual doomsday cult, even if said probability would arguably remain below 0.5 or 0.9 or whatever threshold of credence.
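A rough sketch of this set-narrowing argument, again with invented numbers: say there are 10,000 organizations one might plausibly be joining, of which 10 are doomsday cults, and the joke narrows the plausible candidates to about 200 doomsday-adjacent organizations that still include all 10 cults. Assuming, unrealistically, that every organization in the narrowed set is equally likely,

$$P(\text{cult}) = \frac{10}{10{,}000} = 0.1\% \quad\longrightarrow\quad P(\text{cult}\mid\text{doomsday-flavoured joke}) \approx \frac{10}{200} = 5\%,$$

a large relative boost that nonetheless stays below any of the thresholds mentioned.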
Yes, I understand the statistics you’re trying to point to. I just don’t think it’s as simple as narrowing down the reference class. I expect material differences in behavior between the cases “joining a doomsday cult or something that could reasonably be mistaken for one” and “joining something that kinda looks enough like a doomsday cult that jokes about it are funny, but which isn’t”, and those differences mean that this can’t be solved by a single application of Bayes’ Rule.
Maybe your probability estimate ends up higher by epsilon or so. That depends on all sorts of fuzzy readings of context and estimations of the speaker’s character, far too fuzzy for me to do actual math to it. But I feel fairly confident in saying that it shouldn’t adjust that estimate enough to justify taking any sort of action, which is what actually matters here.
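Continuing the invented-numbers sketch from above to illustrate this objection: if, among people in doomsday-adjacent situations, actual cult-joiners almost never make this kind of joke while non-joiners often do, the joke itself becomes evidence in the other direction. With an assumed 1% chance of such a joke from a genuine joiner, a 20% chance from a non-joiner, and the narrowed 5% prior from the sketch above,

$$P(\text{cult}\mid\text{joke}) = \frac{0.01 \times 0.05}{0.01 \times 0.05 + 0.2 \times 0.95} \approx 0.3\%,$$

which lands within a fraction of a percentage point of the original base rate.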
Well, a doomsday cult is not only a doomsday cult; it also kinda looks enough like a doomsday cult. Of the people joining something that kinda looks enough like a doomsday cult, some are joining an actual doomsday cult. Do those people, in your model, know that they’re joining a doomsday cult, so that they can avoid joking about it?