This one may be breaking the rules due to the use of too many Oracles. If so, please strike this submission. Submission: In round one, have each of three counterfactual Oracles draft an international agreement to achieve goal X, Y, or Z respectively (evaluated counterfactually, as if we never saw the Oracle's answer). Sample goals: reduction in nuclear war risk, reduction in global deaths due to starvation, increase in asteroid detection capabilities, raising the global GDP growth rate, etc.
In round two, present the three agreements to a low-bandwidth Oracle and ask it to pick the agreement most likely to achieve its stated goal.
The counterfactual Oracle whose agreement was selected would be rewarded. The reward for the low-bandwidth Oracle could be determined as in my prior submission: either directly, by measuring the results of the treaty, or as evaluated by an independent third party such as Metaculus.
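The two-round protocol above can be sketched as follows. This is purely an illustrative sketch under my own assumptions: the function names (`draft_agreement`, `pick_best`, `run_protocol`) and the binary reward scheme are hypothetical, and the low-bandwidth Oracle's choice is stubbed with a random pick since no real Oracle exists to query.

```python
import random

# Sample goals from the submission; any goal list would do.
GOALS = [
    "reduce nuclear war risk",
    "reduce global deaths due to starvation",
    "increase asteroid detection capabilities",
]

def draft_agreement(oracle_id: int, goal: str) -> str:
    """Round one: each counterfactual Oracle drafts an agreement for its goal.
    (Stub: a real Oracle would produce the full agreement text.)"""
    return f"agreement drafted by oracle {oracle_id} targeting: {goal}"

def pick_best(agreements: list[str]) -> int:
    """Round two: the low-bandwidth Oracle answers with only an index into
    the three agreements -- a few bits, hence 'low bandwidth'.
    (Stub: a random choice stands in for the Oracle's actual answer.)"""
    return random.randrange(len(agreements))

def run_protocol() -> tuple[int, list[float]]:
    # Round one: three counterfactual Oracles draft agreements.
    agreements = [draft_agreement(i, goal) for i, goal in enumerate(GOALS)]
    # Round two: the low-bandwidth Oracle selects one.
    chosen = pick_best(agreements)
    # Only the counterfactual Oracle whose agreement was selected is rewarded.
    # The low-bandwidth Oracle's own reward would be settled later, from
    # measured treaty outcomes or third-party evaluation (e.g. Metaculus).
    rewards = [1.0 if i == chosen else 0.0 for i in range(len(agreements))]
    return chosen, rewards
```

The selection step deliberately returns only an index, so the low-bandwidth Oracle never emits free-form text; that is the point of splitting drafting and selection across the two Oracle types.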