There exist certain ideas that are very dangerous to think. They make you vulnerable to harm at the hands of future super-intelligences. Such ideas aren't hard to come by; most civilizations stumble upon them. One of the ways to render them inert is to end your existence.
Getting rid of your species seems like going overboard. If you saved the children and had them raised by robots, you could remove whatever dangerous memes you'd created.

Also, if that is a common reaction to the basilisk, then one of its fundamental assumptions is exactly backwards. If your response is to ignore it, or do even less than that, you have nothing to worry about.
You mean like the basilisk?
Like a basilisk, yes.