I feel pretty frustrated at how rarely people actually bet or make quantitative predictions about existential risk from AI.
I think that might be a result of how the topic is, well, just really fucking grim. I think part of what allows discussion of it, and thought about it, for a lot of people (including myself) is a certain amount of detachment. “AI doomers” are often accused of being LARPers, or of not taking their own ideas seriously, because they don’t act like people who believe the world is ending in 10 years. But I’d flip that around—a person who genuinely acts like the world is ending in 10 years would look absolutely insane, and so, to preserve their sanity, people establish a sort of barrier and discuss these things as they would a game or a really interesting scientific question. But actually placing a bet on it? Shorting your own future on the premise that you won’t have a future? That breaks the barrier, and it becomes just really uncomfortable. I know I’d still rather live as if I were dead wrong, no matter how confident I am in being theoretically right. I wonder, in fact, whether this feeling was shared by e.g. game theorists working on nuclear strategy.