If you think of it less like “possibly having a lot of money post-AGI” and more like “possibly owning a share of whatever the AGIs produce post-AGI”, then I can imagine scenarios where that’s very good and important. It wouldn’t matter in the worst scenarios or best scenarios, but it might matter in some in-between scenarios, I guess. Hard to say though …
This is a good point, but even taking it into account I think my overall claim still stands. The scenarios where it’s very important to own a larger share of the AGI-produced pie [ETA: via the mechanism of pre-existing stock ownership] seem pretty unlikely to me compared to, e.g., scenarios where we all die, or scenarios where all humans are given equal consideration regardless of how much stock they own. And (as a separate point) our money will probably have been better spent before AGI on improving the probability of AI going well than saved up to spend on the spoils afterward.