Low-hanging fruit: improving wikipedia entries
Many people likely stumble across Wikipedia entries for topics relevant to those of us who frequent LessWrong: rationality, artificial intelligence, existential risks, decision theory, etc. These pages often shape one’s initial impressions of how interesting, important, or even credible a given topic is, and they can direct people toward productive resources (reading material, organizations like CFAR, notable figures such as Eliezer, etc.). As a result, bringing these entries up to a higher standard of quality is an opportunity to invest relatively little time and effort for a potentially substantial payoff.
I have already decided to improve some of these pages, beginning with the rather sloppy page currently serving as the entry for existential risks, though of course others are welcome to contribute and may be better suited to the task than I am:
https://en.wikipedia.org/wiki/Risks_to_civilization,_humans,_and_planet_Earth
The section on risks posed by AI, for instance, is notably inadequate, while the page includes a bizarre section referencing Mayan doomsday forecasts and Newton’s predictions about the end of the world, neither of which is adequately distinguished from rigorous attempts to assess legitimate existential risks.
I’m also compiling a list of other pages that are, or may be, in need of updating, organized by my rough estimates of their relative importance (a list I’m happy to share, modify, or discuss).
Turning this into a collaborative effort would be far more effective than working alone. If you think this is a worthwhile project and want to get involved, I’d definitely like to hear from you so we can figure out how best to coordinate our efforts.