A minor bottleneck I’ve recently solved:
I found it pretty annoying to have to search for links (especially for Wikipedia) when trying to cite stuff. So I wrote a plugin for the editor I use to easily insert lines from a file (in this case, files with one link per line).
I don’t think many other people are using the same editor, but I wanted to report that this has made writing with a lot of links much easier.
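For concreteness, here is a minimal sketch of that idea in Python, assuming a plain-text file with one link per line and a small helper script the editor calls; the script name, file name, and interface are made up for illustration and are not the actual plugin:

```python
#!/usr/bin/env python3
"""Filter a one-link-per-line file by a query and print the matches,
so an editor keybinding can pipe the chosen line back into the buffer.
(Illustrative only; the real plugin and file layout are unspecified.)"""

import sys
from pathlib import Path

links_file, query = sys.argv[1], " ".join(sys.argv[2:]).lower()
for line in Path(links_file).read_text(encoding="utf-8").splitlines():
    # Case-insensitive substring match against each stored link line.
    if query in line.lower():
        print(line)
```

An editor keybinding could run, e.g., `python pick_link.py wikipedia-links.txt bayes` and insert whichever printed line the user picks.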
What I did for Gwern.net is I wrote a script to parse the entire corpus daily for all links, extract the linked text as well as the URL, and then sort by frequency, do some filtering and white/blacklisting, and spit the pairs out to a file; then my do-everything formatting command treats that file as a giant interactive-search-and-replace list, and I Y/N all the hits to turn them into links. (With various tweaks for smoother workflow omitted for brevity.) It typically takes like 10-20 zoned-out seconds per annotation, as there will usually only be <20 hits total. The benefit here is that if I add links by hand (because they weren’t added by the search-and-replace), they will show up in subsequent search-and-replaces.

This would be ridiculous overkill if one only had a few links, of course, but the list is ~23k entries long now, and it catches a remarkable number of link instances. The more links you add, the smarter the list gets… There is the occasional bit of maintenance, like blacklisting too-vague phrases or being fooled by false positives (particularly common with arXiv papers: did that ‘Wang et al 2023’ semi-auto-link go to the right Wang et al 2023...?), but it fills a big niche in between completely automated link rewrites (slow, dangerous, and thus rare) and the labor of adding each and every possible link yourself one by one (endless tedious toil which will burn you out).
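To make the two steps concrete, here is a rough, self-contained sketch in Python of that kind of pipeline, not the actual Gwern.net scripts: the file names (`links.tsv`), the Markdown-link regex, and the command-line interface are all assumptions for illustration.

```python
#!/usr/bin/env python3
"""Hypothetical sketch of the two-step workflow described above:
(1) harvest (anchor text, URL) pairs from a Markdown corpus into a
frequency-sorted TSV, and (2) interactively offer Y/N replacements in a
draft. File names, the link regex, and the CLI are illustrative only."""

import re
import sys
from collections import Counter
from pathlib import Path

# Matches inline Markdown links: [anchor text](https://example.com)
MD_LINK = re.compile(r'\[([^\]]+)\]\((https?://[^)\s]+)\)')

def harvest_links(corpus_dir, out_file="links.tsv", min_count=2, blacklist=frozenset()):
    """Count every (anchor text, URL) pair in the corpus and write the
    frequency-sorted survivors, minus blacklisted phrases, to a TSV file."""
    counts = Counter()
    for path in Path(corpus_dir).rglob("*.md"):
        counts.update(MD_LINK.findall(path.read_text(encoding="utf-8")))
    with open(out_file, "w", encoding="utf-8") as f:
        for (text, url), n in counts.most_common():
            if n >= min_count and text.lower() not in blacklist:
                f.write(f"{text}\t{url}\n")

def interactive_linkify(draft_file, pairs_file="links.tsv"):
    """For each known anchor text found un-linked in the draft, ask Y/N
    and rewrite the first occurrence as a Markdown link on 'y'."""
    text = Path(draft_file).read_text(encoding="utf-8")
    for line in Path(pairs_file).read_text(encoding="utf-8").splitlines():
        if "\t" not in line:
            continue
        anchor, url = line.split("\t", 1)
        # Whole-word match, crudely skipping phrases already wrapped in [ ].
        pattern = re.compile(r'(?<!\[)\b' + re.escape(anchor) + r'\b(?!\])')
        if pattern.search(text) and input(f"link '{anchor}' -> {url}? [y/N] ").lower() == "y":
            text = pattern.sub(lambda m: f"[{m.group(0)}]({url})", text, count=1)
    Path(draft_file).write_text(text, encoding="utf-8")

if __name__ == "__main__":
    # e.g.  python linkify.py harvest ./corpus    (rebuild links.tsv)
    #       python linkify.py apply draft.md      (Y/N through the hits)
    if sys.argv[1] == "harvest":
        harvest_links(sys.argv[2])
    else:
        interactive_linkify(sys.argv[2])
```

The `blacklist` and `min_count` parameters stand in for the filtering and white/blacklist maintenance mentioned above; the real workflow presumably handles already-linked text, capitalization, and ambiguous phrases far more carefully.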