Unless you include stuff like game theory or mind-upload pontification in “AI”, I’m seeing less than 10% of the new posts on either Discussion or Main as something that could be categorized as AI. Also, I see very little of the sort of very interesting AI discussion like Starglider’s threads on StarDestroyer.net on the site in general; it’s mostly “here’s a news link about AI stuff” and the occasional interview with a mainstream AI researcher. So I’m not really seeing the need for this, since I’m not seeing the implied majority of topics being AI.
I’d like to see some proper AI threads too. I’ve no idea what kind of general research framework the current cutting-edge implementation work, such as whatever Google and IBM are cooking up, is operating in, or whether there are any promising new theoretical approaches toward AGI.
ETA: A division I might support could be something along the lines of “practical rationality with essays and techniques” versus “theoretical crunch with lots of math”. Both of these seem to be represented on the site, valuable, and likely to have significantly non-overlapping readerships. Having the more technical and rigorous theory stuff in its own section might encourage more of it, and the separate sections could develop different conversation styles if needed.
It doesn’t have to be half of the posts—if it’s approaching 10%, then that’s 10% of the posts here being blatantly off-topic. It irks me to no end.
And I’m not sure whether you’re counting links to stuff about the Singularity, or other such off-topic things.
If a post doesn’t at least pretend to be about either epistemic rationality, instrumental rationality, or both, then I don’t want to see it on Less Wrong.