(I work at Anthropic.) My read of the “touch grass” comment is informed a lot by the very next sentences in the essay:
> But more importantly, tame is good from a societal perspective. I think there’s only so much change people can handle at once, and the pace I’m describing is probably close to the limits of what society can absorb without extreme turbulence.
which I read as saying something like “It’s plausible that things could go much faster than this, but as a prediction about what will actually happen, humanity as a whole probably doesn’t want things to get incredibly crazy so fast, and so we’re likely to see something tamer.” I basically agree with that.
> Do Anthropic employees who think less tame outcomes are plausible believe Dario when he says they should “touch grass”?
FWIW, I don’t read the footnote as saying “if you think crazier stuff is possible, touch grass”—I read it as saying “if you think the stuff in this essay is ‘tame’, touch grass”. The stuff in this essay is in fact pretty wild!
That said, I think I have historically underrated questions of how fast things will go given realistic human preferences about the pace of change, and that I might well have updated more in the above direction if I’d chatted with ordinary people about what they want out of the future, so “I needed to touch grass” isn’t a terrible summary. But IMO believing “really crazy scenarios are plausible on short timescales and likely on long timescales” is basically the correct opinion, and to the extent the essay can be read as casting shade on such views it’s wrong to do so. I would have worded this bit of the essay differently.
Re: honesty and signaling, I think it’s true that this essay’s intended audience is not really the crowd that’s already gamed out Mercury disassembly timelines, and its focus is on getting people up to shock level 2 or so rather than SL4, but as far as I know everything in it is an honest reflection of what Dario believes. (I don’t claim any special insight into Dario’s opinions here, just asserting that nothing I’ve seen internally feels in tension with this essay.) Like, it isn’t going out of its way to talk about the crazy stuff, but I don’t read that omission as dishonest.
For my own part:
I think it’s likely that we’ll get nanotech, von Neumann probes, Dyson spheres, computronium planets, acausal trade, etc in the event of aligned AGI.
Whether that stuff happens within the 5-10 year timeframe of the essay is much less obvious to me—I’d put it at around 30-40% odds, conditional on getting powerful AI from roughly the current paradigm.
In the other 60-70% of worlds, I think this essay does a fairly good job of describing my 80th percentile expectations (by quality-of-outcome rather than by amount-of-progress).
I would guess that I’m somewhat more Dyson-sphere-pilled than Dario.
I’d be pretty excited to see competing forecasts for what good futures might look like! I found this essay helpful for getting more concrete about my own expectations, and many of my beliefs about good futures look like “X is probably physically possible; X is probably usable-for-good by a powerful civilization; therefore probably we’ll see some X” rather than having any kind of clear narrative about how the path to that point looks.