Yes, I do. I agree with Eliezer and Nate that the work MIRI was previously funding likely won't yield many useful results, but I don't think it's correct to generalize from that to all agent foundations everywhere. E.g. I'm bullish on natural abstractions, singular learning theory, computational mechanics, incomplete preferences, etc., none of which (except natural abstractions) was on Eliezer's or Nate's radar, to my knowledge.
In the future I'd also recommend actually arguing for the position you're trying to take, instead of citing an org you trust. If you're unable to argue for Eliezer, Nate, and MIRI's position without reference to the org itself, you should probably trust them far less than you do. In this circumstance I can see where MIRI is coming from, so it's no problem on my end. But if I didn't know where MIRI was coming from, I would be pretty annoyed. I also expect my comment here won't change your mind much, since you probably have a different idea of where MIRI is coming from, and your crux may not be any object-level point, but the meta-level question of how good Eliezer and Nate's judgment of research directions is, which determines how much you defer to them and to MIRI.