Because your belief about how well AGI is likely to go affects both the likelihood of the bet ever being evaluated and your chance of winning it, bets about AGI are likely to give dubious results. I also have substantial uncertainty about the value of money in a post-singularity world. Most obviously, if everyone gets turned into paperclips, no one has any use for money. If we get a friendly singleton superintelligence, everyone is living in paradise, whether or not they had money before. If we get an economic singularity, where libertarian ASI(s) try to make money without cheating, then money could be valuable. I’m not sure how we would get that, as an understanding of the control problem good enough not to wipe out humans and fill the universe with banknotes should be enough to make something closer to friendly.
Even if we do get some kind of ascendant economy, given the amount of resources in the solar system (let alone the wider universe), it’s quite possible that pocket change would be enough to live in luxury for aeons.
Given how unclear it is whether the bet will ever get paid, and how much the cash would be worth if it were, I doubt that betting will produce good info. If everyone thinks that money is more likely than not to be useless to them after ASI, then almost no one will be prepared to lock their capital up in a bet until then.
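To make that concrete, here’s a toy model (my own made-up numbers, nothing more) of how a payout that might be useless distorts the odds a bettor will accept:

```python
# Sketch: the odds a bettor accepts depend on the product of two credences --
# that the bet resolves in their favour, and that the payout is worth anything.
# All numbers here are illustrative assumptions, not anything from the post.

def break_even_odds(p: float, q_payout_useful: float) -> float:
    """Decimal odds at which the bet is barely worth taking, when winnings
    are only useful with probability q_payout_useful, while the stake is
    given up for certain (it could have been spent pre-AGI instead)."""
    # Risk-neutral condition: expected useful return per unit staked >= 1,
    # i.e. p * q_payout_useful * odds >= 1.
    return 1.0 / (p * q_payout_useful)

# Someone 50% confident in the proposition, but who thinks money has only a
# 10% chance of mattering afterwards, needs 20:1 rather than 2:1 odds:
print(break_even_odds(0.5, 1.0))   # 2.0  -> reads as a 50% credence
print(break_even_odds(0.5, 0.1))   # 20.0 -> reads as a 5% credence
```

The “implied probability” you’d read off the market odds is really p × q, so it doesn’t tell you anyone’s actual credence in the proposition.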
Thanks for the considered comment.
I think the main crux here is how valuable money will be post-AGI. My impression is that it will still be quite valuable. Unless there is a substantial redistribution effort (which would have other issues), I imagine economic growth will make the rich more money than it makes the poor. I’d also think that even though it would be “paradise”, many people would care about how many resources they have. Having one-millionth of all human resources may effectively give you access to one-millionth of everything produced by future AGIs.
Scenarios where AGI is friendly (i.e. it doesn’t kill us) could matter significantly more to humans than ones where it isn’t. Even if there is only a 1% chance of it being friendly, in that scenario it’s possible we could be alive for a really long time.
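As a rough sketch of that point (the numbers are placeholders I’ve picked purely for illustration), weighting each scenario by how long your resources would keep mattering to you:

```python
# Sketch: why the friendly scenario can dominate expected value even at 1%.
# Contribution of each scenario = P(scenario) * years in which your resources
# still benefit you. All figures are invented for illustration.

scenarios = {
    # name: (probability, years of life in which your money/resources matter)
    "unfriendly (everyone dies)":     (0.99, 0),
    "friendly (very long lifespans)": (0.01, 10**6),
}

for name, (p, years) in scenarios.items():
    print(f"{name}: expected resource-years = {p * years:,.0f}")
# unfriendly: 0; friendly: 10,000 -- the 1% branch carries all of the value,
# which is the sense in which it is "significantly more important".
```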
Last, it doesn’t have to be the case that everyone thinks money will be valuable post-AGI, only that some people with money think so. In that case, they could trade with others pre-AGI to take on that specific risk.
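A toy version of that trade (hypothetical numbers, ignoring counterparty risk and discounting): someone who expects money to keep mattering buys a post-AGI claim from someone who doesn’t, and both expect to come out ahead by their own lights.

```python
# Sketch: a pre-AGI trade that transfers the "will money matter afterwards?"
# risk to whoever believes money will still be useful. Numbers are made up.

def trade_surplus(x_today: float, y_post_agi: float,
                  q_buyer: float, q_seller: float) -> tuple[float, float]:
    """Expected gain for each side, with each party valuing post-AGI dollars
    at their own probability (q) that money will still be useful to them then.
    Counterparty risk and time discounting are ignored for simplicity."""
    buyer_gain = q_buyer * y_post_agi - x_today    # pays now, collects later
    seller_gain = x_today - q_seller * y_post_agi  # collects now, pays later
    return buyer_gain, seller_gain

# Buyer thinks post-AGI money is 80% likely to matter; seller thinks 10%.
# Any price between 0.1*1000 and 0.8*1000 leaves both expecting to gain:
print(trade_surplus(x_today=400, y_post_agi=1000, q_buyer=0.8, q_seller=0.1))
# -> (400.0, 300.0)
```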
So I generally agree there’s a lot of uncertainty, but think it’s less than you do. That said, this is, of course, something to apply predictions to.