(Responding to the object-level question, which is the least interesting part of this but which is the easiest to get started thinking about)
How much money would you ask for, if you and I were both given this offer: “Each of you name an amount of money without communicating until both numbers are known. If you both ask for the same amount, you both get that amount. Otherwise you get nothing. You have 1 minute to decide.”?
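For concreteness, the matching rule in the offer can be written down as a tiny payoff function. This is a minimal sketch of my own, not anything from the post, and the amounts in the example calls are arbitrary:

```python
# Minimal sketch of the offer's payoff rule: each player names an amount,
# and both get that amount only if the two names match exactly.

def payoff(amount_you: int, amount_me: int) -> tuple[int, int]:
    """Return (your payout, my payout) under the matching rule."""
    if amount_you == amount_me:
        return amount_you, amount_me
    return 0, 0

print(payoff(100, 100))  # (100, 100) -- a match pays both players
print(payoff(100, 200))  # (0, 0)     -- any mismatch pays nothing
```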
Now would be a good time to pause in reading, and actually decide.
(spoilers)
...
I initially parsed the question as “assume a random person” and picked “a million.” On re-reading, I see that it specifies Ziz in particular, for whom a trillion does seem like a derivable Schelling number, especially if you know Ziz.
(I think the strategically correct answer might still be a trillion for expected-value reasons, but it probably depends a lot on the rest of the “implied world” that the question suggests. A near-guarantee of “never have to work again” is still pretty good. If I’m unlikely to get weird mysterious offers like this again, and if I don’t expect other agents similar to me to have the option of gaining a trillion dollars, I’m not sure how to think about it.)
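To make the expected-value point concrete, here is a toy comparison. The match probabilities are hypothetical placeholders of my own, not estimates from the post or the thread; the only point is that a much larger prize can dominate in raw expected dollars even at a much lower chance of matching:

```python
# Toy expected-value comparison for the parenthetical above. The match
# probabilities are hypothetical placeholders, chosen only to illustrate the trade-off.
candidates = {
    "a million":  (10**6,  0.50),  # assume a 50% chance the other player matches here
    "a billion":  (10**9,  0.20),  # assume 20%
    "a trillion": (10**12, 0.05),  # assume 5%
}

for name, (amount, p_match) in candidates.items():
    print(f"{name:>10}: EV = ${amount * p_match:>16,.0f}")

# Even at a far lower match probability, the trillion wins on raw expected dollars;
# whether that beats a near-guarantee of "never have to work again" depends on how
# sharply your utility in money diminishes.
```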
I asked a couple of LW readers, who answered “a million” and “a billion.”
A thing that affects my judgment slightly is that “a billion” is the highest number that’s in the zeitgeist as “actual money.” (A trillion is a “real” number, unlike a quadrillion, but it’s not a “real” amount of money.) This, coupled with the fact that the billionaires I know of don’t even seem to know what to do with their money to cause good things to happen, suggests a billion might actually be a derivably better answer than a trillion, unless you are knowably talking to extreme maximizers.
I think it’s 1 Trillion. If you want to match most often, the answer is obviously 1 Million, the standard ‘a lot of money’ answer. Once you’re willing to ask for more than that, 1 Billion is in a strange place: it’s obviously a lot, but also obviously ‘not enough’ in the sense that there are many who would be richer than you, whereas 1 Trillion is clearly ‘enough’ and the only plausible point above a billion. So I don’t think it’s that much less likely, once you know the person knows roughly the orders of magnitude involved and the word “trillion”. If the person really is random, then 1 Billion becomes more reasonable, but I think it has to be ‘average American’-level random.
I haven’t read the second article yet, so I am not sure where this all goes. But I am thinking about “Why Our Kind Can’t Cooperate”, and this feels like a possible approach to solving that problem: instead of trying to achieve coordination by talking (which would encourage some people to give contrarian answers, and virtually guarantee failure), we could choose the coordination point in silence and then explain our choice. (Some wannabe contrarians will still argue for a different answer, but now it will feel like “too late, the others have already agreed, and you are just providing excuses for why you are not there”.)
Instead of asking your tribe to debate, ask yourself what your tribe’s Schelling point is, and then announce it. If you are right, others are likely to agree. (And “Schelling point” itself could be a Schelling point of rationalist group decision-making.)