Oh. I see why the post is being downvoted as well. I’m being forced to address multiple audiences with different requirements by a nearly universal inclination to look for anything to justify criticism or downvoting—particularly since I’m rocking the boat or perceived as a newbie.
I’m a firm believer in Crocker’s Rules for myself, but I think that LessWrong and the SIAI have made huge mistakes in creating an echo chamber that slows or stifles the creation of new ideas and the discovery of errors in old ones, as well as alienating many, many potential allies.
I think we’re seeing different reasons. I think you’re being downvoted because people think you’re wrong, and you think you’re being downvoted because people think you’re right.
I’m being forced to address multiple audiences with different requirements by a nearly universal inclination to look for anything to justify criticism or downvoting—particularly since I’m rocking the boat or perceived as a newbie.
That hypothesis fails to account for all of the dataset: a lot of top-level posts don’t get downvoted to oblivion, even when they’re “rocking the boat” more than you are: see this and this.
I don’t perceive you as “rocking the boat”; I don’t understand enough of what you’re trying to say to tell whether I agree or not. I don’t think you’re more confused or less clear than the average lesswronger; however, your top-level posts on ethics and Friendly AI come off as more confused/confusing than the average top-level post on ethics and Friendly AI, most of which were written by Eliezer.
I don’t know if the perceived confusion comes from the fact that your own thinking is confused, that your thinking is clear but your writing is unclear, or that I myself am confused or biased in some way. There is a lot of writing that falls into that category (Foucault and Derrida come to mind), and I don’t consider it a worthwhile use of my time to try to figure it out, as there is also a large supply of clear writing available.