It seems to me that the single most important thing we could do to make LW more welcoming to new people is to somehow deal with the problem of the neologisms.
(We could try to build a bot that automatically places a link over any neologism, linking to the LW wiki entry, for example.)
The way I would implement this is to add a bot that parses [[wiki-links]] the same way that the wiki parses them.
That way, if someone wants to automatically link to a wiki page for a neologism, they just have to put those brackets around the phrase.
This feature should work both for posts and for comments.
It might also be a good idea for the bot to be able to parse links in this form: [[standard name|alternate name]]. Another option is to create a redirect page on the wiki for any alternative titles for a concept.
Once this is implemented, we can go back through the old posts and add these wiki links wherever it would be appropriate.
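To make the [[...]] idea concrete, here is a minimal sketch in Python of the kind of transformation being proposed; the wiki base URL, the underscore convention, and the HTML output are all assumptions for illustration, not the site's actual code:

```python
import re

# Assumed base URL and title-to-URL convention; the real wiki may differ.
WIKI_BASE = "https://wiki.lesswrong.com/wiki/"

# Matches [[Page name]] and [[Page name|shown text]].
WIKILINK = re.compile(r"\[\[([^\]|]+)(?:\|([^\]]+))?\]\]")

def expand_wikilinks(text):
    def repl(match):
        page = match.group(1).strip()
        label = (match.group(2) or match.group(1)).strip()
        url = WIKI_BASE + page.replace(" ", "_")
        return '<a href="%s">%s</a>' % (url, label)
    return WIKILINK.sub(repl, text)

print(expand_wikilinks("This runs into [[Newcomb's problem|Newcomb-like]] cases."))
```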
> We could try to build a bot that automatically places a link over any neologism, linking to the LW wiki entry, for example.
That would be an awesome tool to have available, to simplify cross-linking, but it could easily be annoying. And I get the impression that most of the “neologisms” that cause issues are actually phrases of ordinary words, like “something to protect”, that are supposed to hearken back to a particular concept/argument.
As long as they’re as easy to describe with regular expressions as “something to protect” is, I don’t think the bot should have a problem, especially when, as in that case, the phrase is the title of a Less Wrong wiki page.
Also, most of Eliezer’s posts are heavily cross-linked by concept, even when the phrase doesn’t match a regular expression. We should encourage this writing style.
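As a rough illustration of that point, wiki page titles can be folded into a single case-insensitive pattern; the phrase list below is invented for the example, not pulled from the actual wiki:

```python
import re

# Toy list of phrases standing in for wiki page titles; purely illustrative.
titles = ["Something to protect", "Newcomb's problem", "Paperclip maximizer"]

# One case-insensitive alternation, longest phrases first so that longer
# titles win over any shorter titles they happen to contain.
pattern = re.compile(
    r"\b(" + "|".join(re.escape(t) for t in sorted(titles, key=len, reverse=True)) + r")\b",
    re.IGNORECASE,
)

def linkable_phrases(text):
    return [m.group(1) for m in pattern.finditer(text)]

print(linkable_phrases("You need something to protect before Newcomb's problem feels urgent."))
```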
Well, the problem with that is that we’ll end up with links all over the place, instead of only on the relevant bits. Problems with the regular expression method as described:

- If you use a word, say “Newcomb’s”, a bunch of times, then you’ll have a bunch of links to the same thing.
- Many of the things we’d like to link to have other meanings, and so will be incorrectly linked; for example, ‘update’, or even “something to protect”.
- While the heavy cross-linking in Eliezer’s posts is good to some extent, you don’t want relevant cross-links to be drowned out by automatic ones.
> If you use a word, say “Newcomb’s”, a bunch of times, then you’ll have a bunch of links to the same thing.
It should be fairly trivial to set the bot up to only link the first instance of each phrase.
> Many of the things we’d like to link to have other meanings, and so will be incorrectly linked; for example, ‘update’, or even “something to protect”.
>
> While the heavy cross-linking in Eliezer’s posts is good to some extent, you don’t want relevant cross-links to be drowned out by automatic ones.
These concerns seem like they’d be minor or moot if the system had the right features. In particular, writers should have a way of stopping the bot from linking a given instance of a phrase. That implies they’d need to know which phrases were being autolinked, which could be handled sensibly by having the autolinks show up when the article is previewed before publishing.
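Both the first-instance rule and the per-instance opt-out look easy to sketch. In the toy example below, the backslash opt-out marker, the wiki URL scheme, and the Markdown-style link output are illustrative assumptions rather than anything the site actually does:

```python
import re

# Assumed URL scheme; the opt-out marker (a leading backslash) is likewise
# just one possible convention, chosen here for illustration.
WIKI_BASE = "https://wiki.lesswrong.com/wiki/"

def autolink(text, phrases):
    """Link only the first instance of each phrase; a leading backslash
    suppresses the link for that instance and is stripped from the output."""
    for phrase in phrases:
        pattern = re.compile(r"(\\?)\b(" + re.escape(phrase) + r")\b", re.IGNORECASE)
        seen = False

        def repl(match):
            nonlocal seen
            if match.group(1):      # writer opted out of this instance
                return match.group(2)
            if seen:                # later instances stay plain text
                return match.group(0)
            seen = True
            url = WIKI_BASE + phrase.replace(" ", "_")
            return "[%s](%s)" % (match.group(2), url)

        text = pattern.sub(repl, text)
    return text

print(autolink(
    "Newcomb's problem is odd; \\Newcomb's problem stays plain, as does Newcomb's problem.",
    ["Newcomb's problem"],
))
```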
Please don’t make automatic links opt-out—that’s just going to make it much, much worse for anyone with a slow connection.
Edit: I don’t think it’s a good idea in the first place, anyway—even ignoring that no possible autolinker would add the correct link in this comment, the server load would be severely increased by the change. Wasn’t someone saying that the system was suffering some strain?
Well, I don’t think this requires a bot; if it’s implemented at all, it could just be in the comment/post parsing engine.
You wouldn’t even need a bot—just an upgrade to the Markdown syntax.
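For concreteness, a Markdown-syntax version might look roughly like the following sketch, written against the python-markdown library’s extension API rather than whatever engine LW actually runs; the wiki base URL is an assumption, and python-markdown also ships a built-in wikilinks extension along these lines:

```python
import xml.etree.ElementTree as etree

import markdown
from markdown.extensions import Extension
from markdown.inlinepatterns import InlineProcessor

WIKI_BASE = "https://wiki.lesswrong.com/wiki/"  # assumed URL scheme

class WikiLinkProcessor(InlineProcessor):
    # Handles [[Page]] and [[Page|shown text]].
    def handleMatch(self, m, data):
        page = m.group(1).strip()
        label = (m.group(2) or m.group(1)).strip()
        el = etree.Element("a")
        el.set("href", WIKI_BASE + page.replace(" ", "_"))
        el.text = label
        return el, m.start(0), m.end(0)

class WikiLinkExtension(Extension):
    def extendMarkdown(self, md):
        md.inlinePatterns.register(
            WikiLinkProcessor(r"\[\[([^\]|]+)(?:\|([^\]]+))?\]\]", md),
            "lw_wikilink",
            175,
        )

md = markdown.Markdown(extensions=[WikiLinkExtension()])
print(md.convert("See [[Something to protect]] for the underlying argument."))
```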