I have skimmed the Alignment Forum side and read most of MIRI’s work before 2015. While it’s hard to know about the “majority of people,” the public reporting does seem to center on two polarized camps. However, in this particular case, I don’t think it’s just the media: the public figures for both sides (EY and Yann LeCun) are pretty consistent in their messaging and in talking past each other.
Also, if the majority of people in the field agree with the above, that’s great news, and it also means that reasonable centrism needs to be more prominently signal-boosted.
On a more object level, as I linked in the post, I think the Alignment Forum is pretty confused about value learning and about the general promise of inverse reinforcement learning (IRL) to solve it (see the sketch below).
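For context on what’s being claimed here: IRL-based value learning is the idea of inferring a reward function under which observed human demonstrations look (soft-)optimal, rather than specifying the reward by hand. Below is a minimal sketch of the max-entropy variant on a toy chain MDP; the environment, the one-reward-parameter-per-state setup, and all hyperparameters are illustrative assumptions of mine, not anyone’s actual proposal.

```python
import numpy as np

N_STATES, N_ACTIONS = 5, 2            # toy chain MDP; actions: 0 = left, 1 = right

def step(s, a):
    """Deterministic chain dynamics with walls at both ends."""
    return max(s - 1, 0) if a == 0 else min(s + 1, N_STATES - 1)

# Transition table T[s, a] -> next state.
T = np.array([[step(s, a) for a in range(N_ACTIONS)] for s in range(N_STATES)])

# "Expert" demonstrations: always move right, then sit at the last state.
HORIZON = 8
def rollout_expert():
    s, states = 0, []
    for _ in range(HORIZON):
        states.append(s)
        s = step(s, 1)
    return states

demos = [rollout_expert() for _ in range(20)]
expert_visits = np.zeros(N_STATES)
for states in demos:
    for s in states:
        expert_visits[s] += 1.0
expert_visits /= len(demos)           # average state-visitation counts per demo

def soft_value_iteration(r, gamma=0.9, iters=100):
    """Soft (max-entropy) value iteration; returns a softmax policy pi[s, a]."""
    V = np.zeros(N_STATES)
    for _ in range(iters):
        Q = r[:, None] + gamma * V[T]                       # Q[s, a]
        m = Q.max(axis=1, keepdims=True)                    # stable log-sum-exp
        V = (m + np.log(np.exp(Q - m).sum(axis=1, keepdims=True))).ravel()
    return np.exp(Q - V[:, None])

def expected_visits(pi):
    """Expected state-visitation counts over HORIZON steps from state 0."""
    d = np.zeros(N_STATES); d[0] = 1.0
    visits = np.zeros(N_STATES)
    for _ in range(HORIZON):
        visits += d
        d_next = np.zeros(N_STATES)
        for s in range(N_STATES):
            for a in range(N_ACTIONS):
                d_next[T[s, a]] += d[s] * pi[s, a]
        d = d_next
    return visits

# MaxEnt IRL: the gradient of the demo log-likelihood w.r.t. a per-state
# reward is (expert visitation counts) - (learner's expected visitation counts).
theta = np.zeros(N_STATES)            # one reward parameter per state
for _ in range(150):
    pi = soft_value_iteration(theta)
    theta += 0.1 * (expert_visits - expected_visits(pi))

print("recovered reward per state:", np.round(theta, 2))
# The last state should get the highest reward, matching the expert's goal.
```

Even in this toy setting you can see the core difficulty people argue about: many different reward functions reproduce the same demonstrations, so “the human’s values” are underdetermined by behavior alone.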
this seems like a major concern in and of itself.
The public figures are drawn from the most extreme positions. And Yudkowsky founded this field, so he’s also legitimately the most sought-after speaker. But things have changed a lot since 2015.
Check out Paul Christiano, Alex Turner, and Steve Byrnes for different views that are neither doomer nor foomer.
I don’t have a survey result handy, but the ones I vaguely remember put p(doom) estimates from within the field vastly lower than MIRI’s 90%+.
I am also familiar with Paul Christiano; I think his arguments for a slower, more continuous takeoff are broadly on the right track as well.
Given that the extreme positions are strongly staked out on Twitter, I am once again claiming that the more reasonable centrism needs an equally strong stake-out. This isn’t the first post in this direction; there were ones before, and there will be ones after.
Just trying to keep this particular ball rolling.
twitter is a toxicity machine, and as a result I suspect that people who lean at all toward reasonableness are avoiding it; certainly that’s why I don’t post much and try to avoid reading my main feed, despite abstractly agreeing with you. that said, here’s me, if it helps at all: https://twitter.com/lauren07102
Yes, I agree. I have ideas for how to fix it as well, but I seriously doubt they will gain much traction.