Fair enough, but it is equally incomplete to treat that as an argument against the possibility of singularity-grade technology emerging in the foreseeable future.
By analogy, there have been many people with crazy beliefs about radioactivity. Doctors prescribed radium as medicine, seemingly on the grounds that it was cool, and anything cool has to be good for you, right? (A similar mentality led some of the ancient Chinese to drink mercury.) Atomic maximalists thought that anything and everything would get better with a reactor strapped to it, never mind the price of uranium, the need for radiation shielding, or the fact that reactors are heavy, both because they need cooling and power-generation systems and because they benefit greatly from economies of scale. Not the sort of thing you necessarily want to bolt onto every car and aircraft! And atom-phobes were convinced that any attempt to utilize nuclear power would automatically become the next Chernobyl.
All of these were crazy, cult-like beliefs. Yet the insanity of people who turned poorly understood scraps of nuclear theory into unreasoning optimism or pessimism does not have a single thing to say about the reality of radioactivity. Atomic bombs and nuclear reactors still work, no matter how foolish the radium-suppository crowd of the early twentieth century was. And they still have sharp limits, no matter how crazily enthusiastic the “atomic cars in twenty years” crowd was.
By all means point out how Ziz’s cult was influenced by singularitarian ideas here. Even point out how the great opportunities and risks that a singularity might bring are a risk factor for cult-style mistakes. But don’t pretend that this prevents advanced technology from existing. Nature simply doesn’t care how we think about it, and isn’t going to make AI impossible just because Ziz had foolish ideas about AI.
But we also can’t pretend that this place is anything but a less extreme spiraling cult of its own, rather than a place that has anything to do with the real world.
Commenting with my mod hat on:
I think it’s important to be able to point out dynamics that feel unsettling and unhealthy without necessarily being able to pin down exactly what’s wrong. Articulating what’s wrong can often be hard, and demanding that people explain themselves well in public is often unfair.
That said, the circumstances that normally make me willing to cut slack for bad argumentation don’t obviously apply to CellBioGuy here.
I also think it’s important not to reflexively shut down people who disagree with the vague LW community consensus.
That said, the comments here just pretty clearly don’t pass muster as acceptable LW comments in the general case. The amount of slack I’m willing to cut them isn’t infinite (and has basically been used up by this point). If you want to repeatedly argue that we’re just a spiraling cult, you do need to, at a minimum, supply any argumentation whatsoever and engage with some object-level claims. It’s easy to toss scary-sounding labels around.
I liked gears to ascension’s reply, which sort of took seriously the concerns CellBioGuy was pointing at while trying to say something substantive about them.
You’ve been saying a lot of things in this reference class in the past week (i.e. vague cult accusations without making substantive claims), and if that keeps up we’ll probably take some kind of mod action about it, although I haven’t thought it through that much at this point.
Citation very much needed. What, specifically, do you disagree with?
Do you believe that the human mind is magical, such that no computer could ever replicate intelligence? (And never mind the ability AI has already shown, from chemistry to StarCraft…)
Do you believe that intelligence cannot create better tools than already exist, such that an AI couldn’t use engineering to meaningful effect? How about persuasion?
Do you believe that automation taking over the economy wouldn’t be a big deal? How about taking over genetics research, which is often bottlenecked by an inability to consider how genes interact, precisely something a computer could help with? Or is learning how to alter our very cores no big deal?
Do you have a specific argument against the plausibility or significance of a singularity? Or is this simply pattern-matching to a cult without any further thought? Because “this sounds weird; it must be wrong” simply doesn’t work. Flight, nuclear power, genetics: all sounded more like science fiction than any real-world possibility.
I think the vibes here have been unhealthy in ways that made model-challenging evidence-seeking harder to think of: for example, Yudkowsky’s intense anxiety, the fact that “dying with dignity” needed to be said at all, and that shock-inducing name. Your reaction seems to me to miss the point I would agree with CellBioGuy about, though I do also think you’re right in every implied rebuttal your questions point to. Trying to solve safety requires a willingness to question deep assumptions while retaining stability in the face of doing so, and retaining stability is hard and often failed at when taking huge magnitude ratios seriously. Ziz seems to me to have been the most intense example of a general failure pattern in how this intercommunity directs a person’s intention, and thinking about that carefully seems important to me. It’s also something where real progress has been made, and I hardly think the critique warrants writing off the site as a whole or anything like that.
I do think calling the whole thing a spiraling cult takes it too far. It has elevated levels of cult disease, along with features that can’t be removed and that make it pattern-match aesthetically to cult adjacency in ways that may not always be fundamental (real crazy things are coming). But I think there are in fact concerning thought-stopping patterns in the interpersonal anxiety dynamics, in the patterns of who trusts whom without ongoing verification, and so on.