I’ll try to keep it civil. I get the feeling the site has drifted so far from its founding goals and members that it now mainly serves to stratify its current readership: either pay into a training seminar through one of the institutions advertised above, or be left behind to bicker over minutiae in an underinformed fashion. That said, nobody can doubt the usefulness of personal study, though it is slow and unguided.
I’m suspicious of the current motives here, and of the atmosphere this site provides. I guess it can’t be helped, since MIRI and CFAR are at the mercy of needing revenue just like any other institution. So where does one draw the line between helpful guidance and malevolent exploitation?
Can you please clarify whose motives you’re talking about, and generally be a lot more specific with your criticisms? Websites don’t have motives. CFAR and MIRI don’t run this website although of course they have influence. (In point of fact I think it would be more realistic to say nobody runs this website, in the sense that it is largely in ‘maintenance mode’ and administrator changes/interventions tend to be very minimal and occasional.)
I think that what you say is true, although I’m unsure that the dichotomy you provide is correct.
Personally, I see great value in a Schelling point that tries to advance rationality. I don’t think the current LW structure is optimal, and I also agree that there’s not enough scaffolding to help newcomers ease into these ideas or provide avenues of exploration.
I also don’t think that CFAR/MIRI have been heavily using LW as a place for advertisement, outside of their fundraising goals, but I haven’t been here long enough to really say. Feel free to correct me with more evidence.
Toward the end of improving materials on rationality, I’ve been thinking about what a collective attempt to write a more practical sequel to the Sequences might look like. CFAR’s curriculum feels like it still captures only a small swath of the whole rationality space. I’m imagining something like a systematic, long-form attempt to teach skills, where we could source quick feedback from people on this site.
Low-quality thought-vomiting, eh?