There is no big scary secret. The only danger to worry about is that this community of schizo scifi nerds is going to have some perceptible and negative influence by spreading and popularizing their bullshit, which will mainly be a problem for the computer science community, especially AI research, since those people are naturally susceptible to such infections.
But I am not too worried about that either. If all the people who buy this bullshit stop working on AI, then maybe that will rejuvenate the field and actually allow some real progress to take place, by giving new ideas a chance and by introducing new perspectives that are less deluded by science-fictional ideas. In a sense, lesswrong/SIAI might function as a crackpot attractor, stripping out all the negative elements so that actual progress can take place.
Alicorn, if a “should this be moderated” poll is required anywhere in this thread this is the kind of trolling that needs to be targeted. Way across the line.
It’s rare enough that handing people with those sentiments the weapon of “any dissent is immediately silenced” is worse than the disease.
Not remotely suggested. Since when does “immediately” mean “after a spiraling trend over a couple of years”?
I suggest that if this is applicable to any behavior in this thread it is to the actual trolling, not Will just being a crackpot.
The quoted sentence refers to the weapon, not the event from which it’s shaped (through misrepresentation or motivated misinterpretation). Even community voting that hides comments that happen to be critical is being used as fuel for accusations of censorship.
A fraction of my comments are outright critical, and I am only posting a few comments per week. There have been dozens of highly critical comments lately that were not made by me, some of them containing direct personal attacks.
If you really perceive the few harsh comments that I make, which reflect a widely held opinion, to be too much, then you have lost all touch with reality and require far more criticism than I could deliver.
Wait a few more years and the shitstorm is going to increase by orders of magnitude and I won’t even be part of it.
Do you really believe that you can get away with your attitude? Be prepared to be surprised.
And stop calling everything “trolling”. It’s really getting boring.
What is way across the line is when people start asking about “secrets” and basilisks and there is any chance of such possibilities being taken seriously. What is way across the line is when an organisation tries to actively impede research.
Some harsh words are completely appropriate then.
I have no problem with Will Newsome and find a lot of his output enjoyable. But if he starts to lend credibility to crazy shit like basilisks in people’s minds, then that has to be said.
I endorse you not worrying about SI impeding AI progress on any significant scale.
I would also endorse, if you’re genuinely interested in encouraging AI research, devoting more of your attention to the problems that are actually impeding AI progress.