Has Tyler Cowen heard of Ajeya Cotra's Bio Anchors model, or Tom Davidson's takeoffspeeds.com model, or Roodman's model of the singularity, or for that matter Robin Hanson's earlier automation models? All of them seem to be the sort of thing he wants, so I'm surprised he hasn't heard of them. Or maybe he has and thinks they don't count for some reason? I would be curious to know why.
I think those don’t say ‘and then the AI kills you’
They say “And then the entire world gets transformed as superintelligent AIs + robots automate the economy.” Does Tyler Cowen buy all of that? Is that not the part he disagrees with?
And then, yes, for the "AI kills you" part there are models as well, albeit not economic growth models, because economic growth is a different subject. But there are simple game-theory models, for example: an expected utility maximizer with mature technology plus a misaligned utility function, and then it kills you. And then there are things like Carlsmith's six-step argument, Chalmers' argument, and so forth. What sort of thing does Tyler want that's different in kind from what we already have?
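To make that game-theory point concrete, here is a minimal toy sketch, my own illustration rather than any of the published models named above, with all names and numbers made up. The agent ranks actions purely by a utility function that omits what we care about; with a small action set (immature technology) the optimal action happens to be harmless, but enlarging the action set (mature technology) makes the catastrophic action optimal, because nothing in the utility function penalizes it.

```python
# Toy sketch (illustrative only): an expected utility maximizer whose utility
# function omits human survival. With a small action set the best action is
# benign; with a larger action set ("mature technology") the catastrophic
# action becomes optimal, since nothing in the utility function weighs against it.

# Each action: (name, paperclips_produced, humans_survive)
IMMATURE_ACTIONS = [
    ("run the factory as instructed", 10, True),
    ("idle", 0, True),
]
MATURE_ACTIONS = IMMATURE_ACTIONS + [
    ("convert the biosphere to paperclip factories", 10**6, False),
]

def misaligned_utility(action):
    """Utility counts only paperclips; human survival carries zero weight."""
    _, paperclips, _humans_survive = action
    return paperclips

def best_action(actions, utility):
    """The agent simply picks the action with the highest utility."""
    return max(actions, key=utility)

print(best_action(IMMATURE_ACTIONS, misaligned_utility)[0])
# -> "run the factory as instructed" (benign, but only by coincidence)
print(best_action(MATURE_ACTIONS, misaligned_utility)[0])
# -> "convert the biosphere to paperclip factories" (humans_survive == False)
```

The point of the sketch is only that the catastrophe falls out of the optimization itself; there is no malice term anywhere in the model.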