Agreed that status / perceived in-field expertise seems pretty important here, especially as seen through the qualitative results (though the Gates talk did surprisingly well given that he's not an AI researcher — but the content reflects that). We probably won't have the [energy / time / money] + [we have limited access to researchers] to test something like this, but I think we can hold "status is important" as pretty well established given these results, Hobbhahn's post (https://forum.effectivealtruism.org/posts/kFufCHAmu7cwigH4B/lessons-learned-from-talking-to-greater-than-100-academics), and a ton of anecdotal evidence from a number of different sources.
(I also think the Sam Bowman article is a great article to recommend, and in fact recommend that first a lot of the time.)