You are being humorous, but here is the answer to your question: People are talking about it obliquely because they want to talk about it openly, but don’t believe they can, without having their discussions disappear.
LW is not a police state. Discussions are free and fearless, except for this one thing. And of course that makes people even more curious to test the boundaries and understand why, on this one topic, the otherwise sensible moderators think that “you can’t handle the truth”.
We can seek a very loose historical analogy in the early days of nanotechnology. Somewhere I read that for several years, Eric Drexler was inhibited in talking about the concept, because he feared nanotechnology’s destructive side. I don’t know what actually happened at all, so let’s just be completely hypothetical. It’s the early 1970s, and you’re part of a little group who stumbled upon the idea of molecular machines. There are arguments that such machines could make abundance and immortality possible. There are also arguments that such machines could destroy the world. In the group, there are people who want to tell the world about nanotechnology, because of the first possibility; there are people who want to keep it all a secret, because of the second possibility; and there are people who are undecided or hold intermediate positions.
Now suppose we ask the question: Are the world-destroying nanomachines even possible? The nano-secrecy faction would want to inhibit public consideration of that question. But the nano-missionary faction might want to encourage such discussion, either to help the nano-secrecy faction get over its fears, or just to make further secrecy impossible.
In such a situation, it would be very easy for the little group of nano-pioneers to get twisted and conflicted over this topic, in a way which to an outsider would look like a collective neurosis. The key structural element is that there is no-one outside the group presently competent to answer the question of whether the world-destroying nanomachines are physically possible. If they went to an engineer or a physicist or a chemist, first they would have to explain the problem—introduce the concept of a nanomachine, then the concept of a world-destroying nanomachine—before this external authority could begin to solve it.
The deep reason why LW has this nervous tic when it comes to discussion of the forbidden topic is that it is bound up with a theoretical preoccupation of the moderators, namely, acausal decision theory.
In my 1970s scenario, the nano-pioneers believe that the only way to know whether grey goo is physically possible or not is to develop the true (physically correct) theory of possible nanomachines; and the nano-secrecy faction believes that, until this is done, the safe course of action is to avoid discussing the details in public.
Analogously, it seems that here in the real world of the 2010s, the handful of people on this site who are working to develop a formal acausal decision theory believe that the only way to know whether [scary idea] is actually possible is to finish developing the theory; and a pro-secrecy faction has the upper hand on how to deal with the issue publicly until that is done.
Returning to the hypothetical scenario of the nano-pioneers, one can imagine the nano-secrecy faction also arguing for secrecy on the grounds that some people find the idea of grey goo terrifying or distressing. In the present situation, that is analogous to the argument for censorship on the grounds that [scary idea] has indeed scared some people. In both cases, it’s even a little convenient—for the pro-secrecy faction—to have public discussion focus on this point, because it directs people away from the conceptual root of the problem.
In my opinion, unlike grey goo, the scary idea arising from acausal decision theory is an illusion, and the theorists who are afraid of it and cautious about discussing it are actually retarding the development of the theory. If they were to state, publicly, completely, and to the best of their ability, what it is that they’re so afraid of, I believe the rest of us would be able to demonstrate that, in the terminology of JoshuaZ, there is no basilisk, there’s only a pseudo-basilisk, at least for human beings.
Well, that was a much more in-depth reply than I was expecting. I had actually been trying to point out that any pro-censorship person who spoke about this idea, ever, for any reason, even to justify the censorship, was actually slitting their own wrists by magnifying its exposure. But this was a very interesting reply, and it sparked some new thoughts for me. Thanks!
I love this post