These are good points. But it does seem like what @iceman meant by the bit that I quoted at least has connotations that go beyond your interpretation, yes?
Whether MIRI is a good place to donate is a very complicated question, but certainly “no” is a valid answer for many donors.
Sure. I haven’t donated to MIRI in many years, so I certainly wouldn’t tell anyone else to do so. (It’s not my understanding that MIRI is funding constrained at this time. Can anyone confirm or disconfirm this?)
What accusation do you see in the connotations of that quote? Genuine question; I could guess, but I’d prefer to know. Mostly the subtext I see from iceman is disappointment and grief and anger and regret. Which are all valid emotions for them to feel.
I think a lot of what might have been serious accusations in 2019 are now common knowledge, e.g. after Bankless, Death with Dignity, etc.

(Edited to fix misquote)
(It’s not my understanding that MIRI is funding constrained at this time. Can anyone confirm or disconfirm this?)
From the Bankless interview:
How do I put it… The saner outfits do have uses for money. They don’t really have scalable uses for money, but they do burn any money literally at all. Like, if you gave MIRI a billion dollars, I would not know how to...
Well, at a billion dollars, I might try to bribe people to move out of AI development, that gets broadcast to the whole world, and move to the equivalent of an island somewhere—not even to make any kind of critical discovery, but just to remove them from the system. If I had a billion dollars.
If I just have another $50 million, I’m not quite sure what to do with that, but if you donate that to MIRI, then you at least have the assurance that we will not randomly spray money on looking like we’re doing stuff and we’ll reserve it, as we are doing with the last giant crypto donation somebody gave us until we can figure out something to do with it that is actually helpful. And MIRI has that property. I would say probably Redwood Research has that property.
So, just to clarify, “serious accusation” is not a phrase that I have written in this discussion prior to this comment, which is what the use of quotes in your comment suggests. I did write something which has more or less the same meaning! So you’re not mis-ascribing beliefs to me. But quotes mean that you’re… quoting… and that’s not the case here.
Anyway, on to the substance:
What “serious accusation” do you see in the connotations of that quote?

And the quote in question, again, is:
So Yudkowsky doesn’t have a workable alignment plan, so he decided to just live off our donations, running out the clock.
The connotations are that Eliezer has consciously chosen to stop working on alignment, while pretending to work on alignment, and receiving money to allegedly work on alignment but instead just not doing so, knowing that there won’t be any consequences for perpetrating this clear and obvious scam in the classic sense of the word, because the world’s going to end and he’ll never be held to account.
Needless to say, it just does not seem to me like Eliezer or MIRI are doing anything remotely like that. Indeed I don’t think anyone (serious) has even suggested that they’re doing anything like that. (The usual horde of haters on Twitter / Reddit / etc. notwithstanding.)
Mostly the subtext I see from iceman is disappointment and grief and anger and regret. Which are all valid emotions for them to feel.
But of course this is largely nonsensical in the absence of any “serious accusations”. Grief over what, anger about what? Why should these things be “valid emotions … to feel”? (And it can’t just be “we’re all going to die”, because that’s not new; we didn’t just find that out from the OP—while iceman’s comment clearly implies that whatever is the cause of his reaction, it’s something that he just learned from Zack’s post.)
I think a lot of what might have been “serious accusations” in 2019 are now common knowledge, e.g. after Bankless, Death with Dignity, etc.
Which is precisely why iceman’s comment does not make sense as a reply to this post, now; nor is the characterization which I quoted an accurate one.
(It’s not my understanding that MIRI is funding constrained at this time. Can anyone confirm or disconfirm this?)
From the Bankless interview:
Yep, I would describe that state of affairs as “not funding constrained”.

I edited out my misquote, my apologies.
I think emotions are not blame assignment tools, and have other (evolutionary) purposes. A classic example is a relationship break-up, where two people can have strong emotions even though nobody did anything wrong. So I do not interpret emotions as accusations in general. It sounds like you have a different approach, and I don’t object to that.
Grief over what, anger about what?
For example, grief over the loss of the $100k+ donation. Donated with the hope that it would reduce extinction risk, but with the benefit of hindsight the donor now thinks that the marginal donation had no counterfactual impact. It’s not blameworthy because no researcher can possibly promise that a marginal donation will have a large counterfactual impact, and MIRI did not so promise. But a donor can still grieve the loss without someone being to blame.
For example, anger that Yudkowsky realized he had no workable alignment plan, in his estimation, in 2015 (Bankless), and didn’t share that until 2022 (Death with Dignity). This is not blameworthy because people are not morally obliged to share their extinction risk predictions, and MIRI has a clear policy against sharing information by default. But a donor can still be angry that they were disadvantaged by known unknowns.
I hope these examples illustrate that a non-accusatory interpretation is sensical, even if you don’t think it plausible.
There’s a later comment from iceman, which is probably the place to discuss what iceman is alleging:
What should MIRI have done, had they taken the good sliver of The Sequences to heart? They should have said oops. They should have halted, melted and caught fire. They should have acknowledged that the sky was blue. They should have radically changed their minds when the facts changed. But that would have cut off their funding. If the world isn’t going to end from a FOOMing AI, why should MIRI get paid?
I think emotions are not blame assignment tools, and have other (evolutionary) purposes. A classic example is a relationship break-up, where two people can have strong emotions even though nobody did anything wrong. So I do not interpret emotions as accusations in general. It sounds like you have a different approach, and I don’t object to that.
You misunderstand. I’m not “interpret[ing] emotions as accusations”; I’m simply saying that emotions don’t generally arise for no reason at all (if they do, we consider that to be a pathology!).
So, in your break-up example, the two people involved of course have strong emotions—because of the break-up! On the other hand, it would be very strange indeed to wake up one day and have those same emotions, but without having broken up with anyone, or anything going wrong in your relationships at all.
And likewise, in this case:
Grief over what, anger about what?
For example, grief over the loss of the $100k+ donation. Donated with the hope that it would reduce extinction risk, but with the benefit of hindsight the donor now thinks that the marginal donation had no counterfactual impact. It’s not blameworthy because no researcher can possibly promise that a marginal donation will have a large counterfactual impact, and MIRI did not so promise. But a donor can still grieve the loss without someone being to blame.
Well, it’s a bit dramatic to talk of “grief” over the loss of money, but let’s let that pass. More to the point: why is it a “loss”, suddenly? What’s happened just now that would cause iceman to view it as a “loss”? It’s got to be something in Zack’s post, or else the comment is weirdly non-apropos, right? In other words, the implication here is that something in the OP has caused iceman to re-examine the facts, and gain a new “benefit of hindsight”. But that’s just what I’m questioning.
For example, anger that Yudkowsky realized he had no workable alignment plan, in his estimation, in 2015 (Bankless), and didn’t share that until 2022 (Death with Dignity). This is not blameworthy because people are not morally obliged to share their extinction risk predictions, and MIRI has a clear policy against sharing information by default. But a donor can still be angry that they were disadvantaged by known unknowns.
I do not read Eliezer’s statements in the Bankless interview as saying that he “realized he had no workable alignment plan” in 2015. As far as I know, at no time since starting to write the Sequences has Eliezer ever claimed to have, or thought that he had, a workable alignment plan. This has never been a secret, nor is it news, either to Eliezer in 2015 or to the rest of us in 2022.
I hope these examples illustrate that a non-accusatory interpretation is sensical, even if you don’t think it plausible.
They do not.
There’s a later comment from iceman, which is probably the place to discuss what iceman is alleging:

Well, you can see my response to that comment.