I think the Wait But Why posts are quite good, though I usually link them alongside Luke Muehlhauser’s reply.
For example, 0% was mentioned in a few places
It’s obviously not literally 0%, and the post is explicitly about ‘how do we succeed?’, with a lengthy discussion of the possible worlds where we do in fact succeed:
[...] The surviving worlds look like people who lived inside their awful reality and tried to shape up their impossible chances; until somehow, somewhere, a miracle appeared—the model broke in a positive direction, for once, as does not usually occur when you are trying to do something very difficult and hard to understand, but might still be so—and they were positioned with the resources and the sanity to take advantage of that positive miracle, because they went on living inside uncomfortable reality. Positive model violations do ever happen, but it’s much less likely that somebody’s specific desired miracle that “we’re all dead anyways if not...” will happen; these people have just walked out of the reality where any actual positive miracles might occur. [...]
The whole idea of ‘let’s maximize dignity’ is that it’s just a reframe of ‘let’s maximize the probability that we survive and produce a flourishing civilization’ (the goal of the reframe being to guard against wishful thinking):
Obviously, the measuring units of dignity are over humanity’s log odds of survival—the graph on which the logistic success curve is a straight line. [...] But if enough people can contribute enough bits of dignity like that, wouldn’t that mean we didn’t die at all? Yes, but again, don’t get your hopes up.
Hence:
Q1: Does ‘dying with dignity’ in this context mean accepting the certainty of your death, and not childishly regretting that or trying to fight a hopeless battle?
Don’t be ridiculous. How would that increase the log odds of Earth’s survival?
I dunno what the ‘0%’ means exactly, but it’s obviously not literal. My read of it was something like ‘low enough that it’s hard to be calibrated about exactly how low it is’, plus ‘low enough that you can make a lot of progress and still not have double-digit success probability’.
I think the Wait But Why posts are quite good, though I usually link them alongside Luke Muehlhauser’s reply.
Cool! Good to get your endorsement. And thanks for pointing me to Muehlhauser’s reply. I’ll check it out.
I dunno what the ‘0%’ means exactly, but it’s obviously not literal. My read of it was something like ‘low enough that it’s hard to be calibrated about exactly how low it is’, plus ‘low enough that you can make a lot of progress and still not have double-digit success probability’.
Ok, yeah, that does sound pretty plausible. It still encompasses a pretty wide range though. Like, it could mean one in a billion, I guess? There is a grim tone here. And Eliezer has spoken about his pessimism elsewhere with a similarly grim tone. Maybe I am overreacting to that. I dunno. Like Raemon, I am still feeling confused.
It’s definitely worth noting though that even at a low number like one in a billion, it is still worth working on for sure. And I see that Eliezer believes this as well. So in that sense I take back what I said in my initial comment.
One in a billion seems way, way, way too low to me. (Like, I think that’s a crazy p(win) to have, and I’d be shocked beyond shocked if Eliezer’s p(win) were that low. Like, if he told me that I don’t think I’d believe him.)
Ok, that’s good to hear! Just checking, do you feel similarly about 1 in 100k?
No, that’s a lot lower than my probability but it doesn’t trigger the same ‘that can’t be right, we must be miscommunicating somehow’ reaction.
I see. Thanks for the response.
FYI I am finding myself fairly confused about the “0%” line. I don’t see a reason not to take Eliezer at his word that he meant 0%. “Obviously not literal” feels pretty strong; if he meant a different thing, I’d prefer the post to say whatever he meant.
Eliezer seemed quite clear to me when he said (paraphrased) “we are on the left side of the logistic success curve, where success is measured in how many leading zeros you are removing from your probability of success”. The whole post seems to clearly imply that Eliezer thinks marginal dignity is possible, which he defines as a unit of movement in the log odds of success. This implies the probability is not literally 0, but it does argue that the probability (on a linear scale) can be rounded to 0.
Mostly I had no idea if he meant like 0.1, 0.001, or 0.00000001. Also not sure if he’s more like “survival chance is 0%, probably, with some margin of error, maybe it’s 1%”, or “no, I’m making the confident claim that it’s more like 0.0001”.
(This was combined with some confusion about Nate Soares saying something in the vein of “if you don’t wholeheartedly believe in your plans, you should multiply their EV by 0, and you’re not supposed to pick the plan whose epsilon value is ‘least epsilon’”.)
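(Aside, as an illustration rather than anything from the post or this thread: a minimal sketch of how the log-odds framing cashes out numerically. The function name and the example probabilities below are my own inventions; the only assumption carried over from the quoted passage is that “dignity” is measured in log odds of success.)

```python
# Minimal sketch (illustration only): log odds, measured in bits, as the
# "dignity" unit discussed above. All probabilities here are made-up examples.
import math

def log_odds_bits(p: float) -> float:
    """Return log2(p / (1 - p)), i.e. the log odds of p in bits."""
    return math.log2(p / (1.0 - p))

for p in [1e-9, 1e-6, 1e-4, 1e-2]:
    print(f"p = {p:g}  ->  log odds = {log_odds_bits(p):.2f} bits")

# Removing one leading zero (a 10x improvement in a small p) is worth about
# 3.3 bits; going from 1e-9 to 1e-6 is roughly 10 bits of progress, yet the
# success probability still rounds to 0% on a linear scale.
print(f"gain from 1e-9 to 1e-6: {log_odds_bits(1e-6) - log_odds_bits(1e-9):.2f} bits")
```

(The point of the sketch is just that large gains in log odds are compatible with a success probability that still rounds to 0% on a linear scale, which seems to be the sense in which “0%” is being used here.)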
Also, MIRI isn’t (necessarily) a hive mind, so I’m not sure whether Rob, Nate, or Abram actually share Eliezer’s estimate of how doomed we are.
Also, MIRI isn’t (necessarily) a hive mind, so I’m not sure whether Rob, Nate, or Abram actually share Eliezer’s estimate of how doomed we are.
Indeed, I expect that the views of at least some individuals working at MIRI vary considerably.
In some ways, the post would seem more accurate to me if it had the Onion-esque headline: Eliezer announces on MIRI’s behalf that “MIRI adopts new ‘Death with Dignity’ strategy.”
Still, I love the post a lot. Also, Eliezer has always been pivotal in MIRI.
The five MIRI responses in my AI x-risk survey (marked with orange dots) show a lot of variation in P(doom).
(Albeit it’s still only five people; maybe a lot of MIRI optimists didn’t reply, or maybe a lot of pessimists didn’t, for some reason.)
Personally, I took it to be 0% within an implied # of significant digits, perhaps in the ballpark of three.