Idea: The Great Filter as a self-imposed measure by sentient life to mitigate the inevitable thought-experiment blunders early in its history.
I don’t understand. Can you explain that more?
There exist certain ideas that are very dangerous to think. They make you vulnerable to harm at the hands of future super-intelligences. Such ideas aren’t hard to come by; most civilizations stumble upon them. One of the ways to render them inert is to end your existence.
You mean like the basilisk?
Getting rid of your species seems like going overboard. If you saved the children and had them raised by robots, you’d be able to remove whatever dangerous memes you had created.
Also, if that is a common reaction to the basilisk, then one of its fundamental assumptions is the opposite of true. If your response is to ignore it, or something even less cooperative, you have nothing to worry about.
Like a basilisk, yes.
Idea: The concept of a Great Filter is a collective failure of imagination on the part of humanity, amplified by a severe lack of data.
Yeah, I (and others) have been saying this here and elsewhere.