I do not think any conventional threat such as nuclear war, super pandemic or climate change is likely to be an ER
Are you including risks from advanced biotechnology in that category? To me, it would seem odd to call that a “conventional threat”; that category sounds to me like it would refer to things we have a decent amount of understanding of and experience with. (Really this is more of a spectrum, and our understanding of and experience with risks from nuclear war and climate change is of course limited in key ways as well. But I’d say it’s notably less limited than is the case with advanced biotech or advanced AI.)
with the last <1% being from more unusual threats such as the simulation being turned off, false vacuum collapse, or hostile alien ASI, but also from unforeseen or unimagined threats.
It appears to me that there are some important risks that have been foreseen and imagined which you’re not accounting for. Let me know if you want me to say more; I hesitate merely because I’m wary of pulling independent views towards community views in a thread like this, not for infohazard reasons (the things I have in mind are widely discussed and non-exotic).
Note: I made this prediction before looking at the Effective Altruism Database of Existential Risk Estimates.
I think it’s cool that you made this explicit, to inform how and how much people update on your views if they’ve already updated on views in that database :)
I’m not including advanced biotech in my conventional threat category; I really should have elaborated more on what I meant: Conventional risks are events that already have a background chance of happening (as of 2020 or so) and do not include future technologies.
I make the distinction because I think we don’t have enough time left before ASI to develop such advanced tech ourselves, so an ASI would be overseeing its development and deployment, which I think reduces the threat massively (even if the tech were used by a rogue AI, I would say the ER came from the AI, not the tech). And that time limit applies not just to tech development but also to runaway optimisation processes and societal forces (i.e. suboptimal value lock-in), as a friendly ASI should have enough power to bring them to heel.
My list of threats wasn’t all-inclusive; I paid lip service to some advanced tech and some of the more unusual scenarios, but generally I just thought that past ASI nearly nothing would pose a real threat, so I didn’t focus on it. I am going to read through the database of existential risk estimates, though; does it include what you were referring to? (“important risks that have been foreseen and imagined which you’re not accounting for”)
Conventional risks are events that already have a background chance of happening (as of 2020 or so) and do not include future technologies.
Yeah, that aligns with how I’d interpret the term. I asked about advanced biotech because I noticed it was absent from your answer unless it was included in “super pandemic”, so I was wondering whether you were counting it as a conventional risk (which seemed odd) or excluding it from your analysis (which also seems odd to me, personally, but at least now I understand your short-AI-timelines-based reasoning for that!).
I am going to read through the database of existential risk estimates, though; does it include what you were referring to?
Yeah, I think all the things I’d consider most important are in there. Or at least “most”—I’d have to think for longer in order to be sure about “all”.
There are scenarios that I think aren’t explicitly addressed in any estimates in that database, like things to do with whole-brain emulation or brain-computer interfaces, but these are arguably covered by other estimates. (I also don’t have a strong view on how important WBE or BCI scenarios are.)
Thanks for the feedback :)