“…a Scandinavian country which set aside an island for growing big trees for making wooden warships in the 1900s…”
One could also see this as part of a diversified investment strategy. Putting aside some existing resources for future use is surely not a bad idea. The intended purpose may have been ‘wrong’. But as you say: it can have an unanticipated benefit.
And that seeing would be an excellent example of a post-factum justification of an error.
Or an argument that we should act so that even if we are in error, the consequences are not dire.
I submit that none of us has a clue as to the consequences in a hundred years of what we are doing now.
Really? Is this something you’ve said before and I’ve missed it? If true, it has huge implications.
I don’t think I’ve said it before in these words but I may have expressed the same idea.
Why do you think there are huge implications?
If I believe that, I would forget about AI, x-risk and just focus on third-world poverty.
Well, it’s up to you to decide how much the uncertainty of outcome should influence your willingness to do something. It’s OK to think it’s worthwhile to follow a certain path even if you don’t know where it would ultimately lead.
“Uncertainty” is different from “no clue.” Or maybe I’m assuming too much about what you mean by “no clue”—to my ear it sounds like saying we have no basis for action.
Large amounts of uncertainty, including the paradoxical possibility of black swans, == no clue.
You have no basis for action if you are going to evaluate your actions on the basis of consequences in a hundred years.
You don’t have more information about the hundred-year effects of your third-world poverty options than you do about the hundred-year effects of your AI options.
Effects of work on AI are all about the long run. Working on third-world poverty, on the other hand, has important and measurable short-run benefits.
Good point!
Sure, if you intended it for one special purpose and just got lucky with another purpose, it would be a good excuse. We don’t know what the Scandinavians reasoned, other than the possibly often-retold warship story.
The lesson: If you reserve resources for a specific purpose, either make sure to allow more general usage, or reserve multiple different resources for other purposes too.