The surviving worlds look like people who lived inside their awful reality and tried to shape up their impossible chances; until somehow, somewhere, a miracle appeared—the model broke in a positive direction, for once, as does not usually occur when you are trying to do something very difficult and hard to understand, but might still be so—and they were positioned with the resources and the sanity to take advantage of that positive miracle, because they went on living inside uncomfortable reality.
Can you talk more about this? I’m not sure what actions you want people to take based on this text.
What is the difference between a strategy that is dignified and one that is a clever scheme?
I may be misunderstanding, but I interpreted Eliezer as drawing this contrast:
Good Strategy: Try to build maximally accurate models of the real world (even though things currently look bad), while looking for new ideas or developments that could save the world. Ideally, the ideas the field puts a lot of energy into should be ones that already seem likely to work, or that seem likely to work under a wide range of disjunctive scenarios. (Failing that, they at least shouldn’t require multiple miracles, and should lean on a miracle that’s unusually likely.)
Bad Strategy: Reason “If things are as they appear, then we’re screwed anyway; so it’s open season on adopting optimistic beliefs.” Freely and casually adopt multiple assumptions based on wishful thinking, and spend your mental energy thinking about hypothetical worlds where things go unusually well in specific ways you’re hoping they might (even though, stepping back, you wouldn’t have actually bet on those optimistic assumptions being true).
(Endorsed.)
Thanks for endorsing this; that really helped clarify your position.