Near as I can tell, LW ‘faded’ because it was initially created to sneak obsession with a particular flavor of AI apocalypse into the minds of as wide an audience as possible under the guise of a rationality movement, so as to make money for the Singularity Institute, and now there are actual institutions which are more lucrative potential funding sources. That, and the AI safety subtext that underlay everything has gone nowhere at the institute itself, to all outward appearances.
Interesting place to talk about interesting things with interesting people though! Keep that around for sure!
Near as I can tell LW ‘faded’ because it was initially created to sneak
fnord
obsession
fnord
with a particular flavor of AI apocalypse into the minds of as wide an audience as possible under the guise
fnord
of a rationality movement so as to make money
fnord
for the Singularity Institute, and now there are actual
fnord
institutions which are more lucrative
fnord
potential funding sources. That, and the AI safety subtext
fnord
that underlay everything has gone nowhere at the institute itself to all outward appearances.
To put that another way, Eliezer founded LessWrong in order to attract people capable of understanding and learning from the material that he posted, in the hope that some few of them might be capable of functioning at the level he considered a prerequisite to work on the problem of FAI. I believe he said this at the time. And it worked. LessWrong operated exactly as it was intended. MIRI and CFAR have their own traction now and LessWrong has completed its function. It still has a certain usefulness as a community locus, a discussion forum, and a place for relevant announcements, but it cannot be again what it was in the days of the Sequences.
Those aren’t fnords, they’re overt criticisms.
I agree those are fnords (unnecessary negative-affect words used together to produce a signal).
I don’t disagree with the underlying criticism if reexpressed in neutral language, however.