Hi,
Let me introduce myself: I’m Sean and I work as a project manager at FHI (finally got around to registering!). In posts here I won’t be speaking on behalf of FHI unless I explicitly say so (although, like Stuart, I imagine I often will be). I’m not officially involved with CSER, but I’m in communication with them and hope to keep up to date with them over the coming months.
A few comments on your observations:
2) CSER have done a deliberate and well-orchestrated “media splash” campaign over the last week, but I believe they’re finished with this now. They’ve got some big names involved and a good support structure in place in Cambridge, which helps.
3) My understanding is that CSER hasn’t published anything yet because they don’t exist yet in a practical sense—they’ve been founded but nobody’s employed, and they’re still gathering seed funding.
4) The Sunday Times article’s a bit unfortunate, and the general feeling at FHI is that we’re not too impressed by the journalist’s work. But please note that the more “controversial” statements are the journalist’s own thoughts (this isn’t clear in all places if you skim the article, as I did at first). CSER has some good people behind it, and at the time of writing FHI plans to support it and collaborate with it where possible—we think it’s a very positive development in the field of X-risk. Even the term getting out there is a positive!
Welcome, and thanks for the comments.
Agreed.
If journalism demands that you stick to Hollywood references when communicating a concept, it wouldn’t be so bad if journalists managed to understand and convey the distinction between:
The wholly implausible, worse-than-useless “Terminator humanoid hunter-killer robot” scenario.
The not-completely-far-fetched “Skynet launches every nuke, humanity dies” scenario.
I think it works as a hierarchy of increasingly complex models. Readers will stop at whichever rung they’re comfortable with, depending on their curiosity and background.
My real-life conversations on X-risk tend to go:
Terminator
Drones
Skynet
Specialized AI
General AI
Friendly AI