I asked the forecasting AI three questions:
Will Iran possess a nuclear weapon before 2030?
539's Answer: 35%
Will Iran possess a nuclear weapon before 2040?
539's Answer: 30%
Will Iran possess a nuclear weapon before 2050?
539's Answer: 30%
Given that the AI apparently doesn’t understand that things are more likely to happen if given more time, I’m somewhat skeptical that it will perform well in real forecasts.
Is this necessarily true? Say tighter nuclear regulation is enacted in 2031, or nuclear material runs out in the 2030s, or the model expects peace in the 2030s. Wouldn't these situations reduce the likelihood of Iran having a nuke? All else being equal, I would expect the likelihood to go up over time, but external events may push it down more than it goes up.
It doesn’t say "during 2040"; it says "before 2040".
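To spell that out with the numbers quoted above: the 2030 event is contained in the 2040 event, which is contained in the 2050 event, so any coherent set of answers must be non-decreasing across the three horizons. A minimal statement of the nesting argument:

$$
\{\text{nuke before }2030\} \subseteq \{\text{nuke before }2040\} \subseteq \{\text{nuke before }2050\}
\;\Longrightarrow\;
P(\text{before }2030) \le P(\text{before }2040) \le P(\text{before }2050).
$$

Answering 35% for 2030 but 30% for 2040 violates that inequality no matter what regulation, material shortages, or peace deals are expected in the 2030s; those could only slow the probability's growth over time, not push it below its earlier value.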