“If you perform experiments to determine the physical laws of our universe, you will learn how to make powerful weapons.”
It’s all about incentives.
“physical laws” and “universe” may presuppose too much background.
I cross-pollinate your thing with EY’s:
“If you test theories by how precisely they predict experimental results, you will learn how to make powerful weapons.”
EDIT: My latest version is “If you test theories by how precisely they predict experimental results, you will unlock the secrets of the ancients,” which fixes a few bugs.
That parses as ‘do not let others conduct experiments’. Probably not what you’re aiming for.
Oops.
“If everyone tests theories by how precisely they predict experimental results, the secrets of the ancients will be unlocked.”
“If everyone tests theories by how precisely they predict experimental results, you will learn to cure illnesses and solve many other problems.”
“If you test theories by how precisely they predict experimental results, you will have many more opportunities to have sex and look cool.”
I find this unconvincing. I think I might just go wallow in the muck instead.
If you get into the practice of keeping honest and accurate records of everything, and quantifying as much as you can, then you will become much better at military logistics.
If you just want incentives then I’d go with—“In 500 years, a gamma ray burst will wipe out all humanity unless you colonize distant stars, so get to work.”
After all, ‘powerful weapons’ presumably caused the problem in the first place. A burning, racially/religiously/culturally rooted drive to reach the stars would be far more useful in the long run than a desire to conquer our enemies, even if it is based on a lie.
How would their perception of that claim differ from our perception of the Mayans’ claim about 12/21/12?
It would be no different than how “germs cause diseases” would be dismissed as not trusting in evil spirits, or the atomic hypothesis being the rantings of a madman. Presumably anything we tell them they’ll have to take on faith until they can check for themselves.
And by the time they’re putting up orbital telescopes to look for possible gamma ray burst candidates, I think humanity will be in a safe enough position.
There would BE a claim, for starters… Excellent point though, you’d need some additional evidence or stagecraft to impress them, which probably counts as increasing the size of the message.
Do you think wisdom automatically follows knowledge, however?
I think we can take it as given that even with the nukes, science has been a win. Then again, we are talking about a post-apocalyptic future…
Science has been a win in cultures in which knowledge hasn’t exclusively been pursued for the purposes of killing people. It’s not merely that it’s a post-apocalyptic future; the problem is that there has historically been a self-selection process governing who pursued knowledge, and that process is getting subverted here.
We need some historical cites, because I don’t know what you are talking about.
For the obvious example, many of the best scientists working on the Manhattan Project only agreed because they were worried about Germany getting nuclear weapons first and what it would mean; likewise, scientists in Germany were deliberately sabotaging their own research.
There’s a safety feature here that this quote, to the extent that it is effective at all, deliberately attempts to remove.
Brief research suggests this might not be true.
I’m unconvinced that scientific progress is an existential risk, and the increased wealth scientific progress has created has funded or inspired most social progress.
Scientific progress for the explicit and deliberate purpose of killing people more efficiently is a different animal than scientific progress more generally. You’re engaging in an association fallacy, specifically honor by association (although that fallacy is more often used to refer to individuals or organizations rather than abstract concepts).
Yes, that is the essence of our disagreement. You think I’m committing an association fallacy, and I think you are artificially dividing science in ways that don’t reflect actual historic scientific practice.
That’s a good point. Maybe we can come up with a better incentive.
“If you do X, you will unlock the secrets of the ancients.”
For what reason do you believe this?
Strictly one? Or overwhelmingly one?
Wisdom seems rare in all cases; knowledge, however, is common.