An idea for getting more AI Alignment peer review without compromising academic careers or reputations: create a review system on the Alignment Forum. What I have in mind is that people willing to do reviews can sign up somewhere. Someone who has published a post and wants a review can then spend a token (earned in ways I explain below) to request one. Someone (maybe the AF admins, maybe a dedicated administrator of the system) then assigns one of the signed-up reviewers to the post.
The review has to follow some guidelines: summarize the post, explain its strengths and its issues, and propose new ideas. Once the review is posted and validated by the people in charge of the system, the reviewer earns a token she can spend to request a review of one of her own posts.
How do you bootstrap? For long-time AF users, it makes sense to just grant some initial tokens. For newcomers (who have the most to gain from reviews), I was thinking of awarding a token for writing a good distillation post.
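To make the token flow concrete, here is a minimal sketch of the bookkeeping, assuming a simple ledger design; every name in it (Member, ReviewLedger, and so on) is a hypothetical illustration, not an existing AF feature:

```python
from dataclasses import dataclass


@dataclass
class Member:
    name: str
    tokens: int = 0         # spendable review tokens
    reviewer: bool = False  # has signed up as a reviewer


class ReviewLedger:
    """Hypothetical ledger tracking token spending, assignment, and rewards."""

    def __init__(self) -> None:
        # post title -> assigned reviewer (None while awaiting assignment)
        self.assignments: dict[str, Member | None] = {}

    def request_review(self, author: Member, post: str) -> None:
        """An author spends one token to put a post in the review queue."""
        if author.tokens < 1:
            raise ValueError("no tokens: earn one by reviewing or distilling")
        author.tokens -= 1
        self.assignments[post] = None

    def assign(self, post: str, reviewer: Member) -> None:
        """An admin matches a signed-up reviewer to a queued post."""
        if not reviewer.reviewer:
            raise ValueError("only signed-up reviewers can be assigned")
        self.assignments[post] = reviewer

    def validate_review(self, post: str) -> None:
        """Once the posted review is validated, the reviewer earns a token."""
        reviewer = self.assignments.pop(post)
        reviewer.tokens += 1

    def reward_distillation(self, newcomer: Member) -> None:
        """Bootstrap path: a validated distillation post earns a token."""
        newcomer.tokens += 1


# Example flow: Alice spends her token; Bob reviews her post and earns one.
ledger = ReviewLedger()
alice, bob = Member("alice", tokens=1), Member("bob", reviewer=True)
ledger.request_review(alice, "My alignment idea")
ledger.assign("My alignment idea", bob)
ledger.validate_review("My alignment idea")
assert (alice.tokens, bob.tokens) == (0, 1)
```

The point of the sketch is just that the system is a closed loop: tokens only enter through validated reviews and distillation posts (plus the initial grants to long-time users), so every requested review is backed by work already done.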
While a system like this is not as ambitious as a journal, I think it might solve two problems at once:
The lack of public feedback and in-depth peer review on most posts here
The lack of any feedback for newcomers who don’t have private gdocs shared with a lot of researchers.
There’s probably a way to do even better on the second point, for example by offering personal mentorship for something like three tokens.
I also believe the incentives would push people to participate without necessarily trying to game the system (being limited to the AF, which is a small community, also helps).

What do you think?
The lack of public feedback and in-depth peer review on most posts here
The lack of any feedback for newcomers [...]
I think you need to distinguish clearly between wanting more peer interaction/feedback and wanting more peer review.
Academic peer review is a form of feedback, but it is mainly a form of quality control, so the scope of the feedback tends to be very limited in my experience.
The most valuable feedback, in terms of advancing the field, is comments like ‘maybe if you combine your X with this Y, then something very new/even better will come out’. This type of feedback can happen in private gdocs or LW/AF comment sections, less so in formal peer review.
That being said, I don’t think that private gdocs or LW/AF comment sections are optimal peer interaction/feedback mechanisms; something better might be designed. (The usual offline solution is to put a bunch of people together in the same building, either permanently or at a conference, and have many coffee breaks. Creating the same dynamics online is difficult.)
To make this more specific, here is what usually stops me from contributing feedback in AF comment sections. The way I do research, I tend to go for months without reading any AF posts, as they would distract me too much. When I catch up, I have little motivation to add a comment, quick or detailed, to a two-month-old post.
One alternative would be to raise funds (perhaps from the EA LTF fund) to pay reviewers to perform reviews.