Call for Papers on AI/robot safety
The open-access Journal of Robotics has posted the Call for Papers for an upcoming “Special Issue on Robotic Safety & Security.”
One of the guest editors for this issue is Roman Yampolskiy, a past SI visiting fellow and author or co-author of several papers on AI risk: "Leakproofing the Singularity," "Artificial General Intelligence and the Human Mental Model," and "Safety Engineering for Artificial General Intelligence."
Check the PDF for full details, but:
Manuscripts due June 8, 2012
Reviews due August 31, 2012
Publication date: October 26, 2012
Read the author guidelines
Because the Journal of Robotics is an open-access journal, it charges an "article processing fee" of $500 to cover its costs (details). You are charged only if your submission is accepted and published by the journal.
Update: The Singularity Institute will reimburse you for your article processing fee if we think the article you’re submitting is worthwhile. Contact luke [at] singularity.org for details.
Do you know if undergraduates are allowed to submit?
I’ve never seen any restriction on who can submit to a journal (have you?), so my confident answer is yes. That said, your life will be easier if you’re working with people who have academic-publishing experience.
Interesting business model, with the article processing fee. If your article is any good, just publish it online yourself; a monetarily motivated evaluation doesn’t impress people much. And the $500 is better donated to an actual charity.
There are plenty of good open-access journals; it’s now a standard business model in some fields and has zero impact on how an article is perceived. The good PLoS or BMC journals, for example, are as well regarded as any comparably focused journal. Likewise, if you pay an open-access fee to a journal that doesn’t require it, no one will imagine you’re bribing them or anything ridiculous like that. This journal in particular is probably not a great choice (it’s a Hindawi title), and the thought process hinted at (re: the editor) may not be great either.
It’s called monetizing. You acquire the perception of status somehow, you go public, you have shareholders, and you’re obligated to monetize that asset, i.e. sell the status for money. That devalues the status, but it keeps working for a while due to inertia. Paying people to grade the work, especially when they’re paid only if they grade it as good, is selling the grades. Granted, an upscale restaurant may refuse to serve a drunk and may have a dress code, but don’t mistake that for peer review. Conflicts between monetary and other interests are consistently resolved in favour of the monetary ones.
I would say this exchange basically exemplifies why I don’t participate in Less Wrong.