I don’t think that “we manage to find a smart way to avoid a disaster, though we almost lose anyway” implies “being smart automatically means that we win”.
I said nothing about smartness automatically meaning that we win. My point is more that the universe doesn’t care about whether you are smart. It’s the core of what Beyond the Reach of God is about, and I come into contact with it once a year at the solstice.
For me it’s an important part of the core narrative of the solstice that the world doesn’t just let the hero win because he comes up with a smart solution.
I think there’s a huge danger if people think that being smart and caring about AI safety is enough and then push forward projects like OpenAI that increase capabilities.
To the extent that fiction can teach narratives to people, the Beyond the Reach of God narrative seems important.
Who specifically do you think should act differently, and in what concrete way, because they are more aware of the Beyond the Reach of God narrative?
I don’t think the intellectual work of finding a concrete way that’s likely to make humanity survive an AGI going foom has been done yet. If there were such a concrete way, the problem would be a lot less problematic.
Hopefully, places like MIRI and FHI will do that work in the future. So I would expect people who take it seriously to support organizations like MIRI and FHI over OpenAI, which pushes for capability increases.