Like you (I presume), I would like to see more posts about reasoning and, despite my transhumanist sympathies, fewer about boxed AIs, hypothetical torture scenarios, and the optimality of donating to the Friendly AI cause: focusing our efforts on reasoning is more interesting, more broadly appealing, and ultimately more effective for everyone involved, including SIAI.
Disagree on the “fewer” part. I’m not sure about SIAI, but I think my personal interests, at least, would not be better served by having fewer transhumanist posts. It might be a good idea to move such posts into a subforum, though. (I think supporting such subforums was discussed in the past, but I don’t remember whether it wasn’t done for lack of resources or because of some downside to the idea.)
Fair enough. It ultimately comes down to whether tickling transhumanists’ brains wins us more than we’d gain from appearing correspondingly more approachable to non-transhumanist rationalists, and there are enough unquantified values in that equation to leave room for disagreement. In a world where a magazine as poppy and mainstream as TIME likes to publish articles on the Singularity, I could easily be wrong.
I stand by my statements when it comes to SIAI-specific values, though.