The most merciful thing in the world, I think, is the inability of the human mind to correlate all its contents. We live on a placid island of ignorance in the midst of black seas of infinity, and it was not meant that we should voyage far. The sciences, each straining in its own direction, have hitherto harmed us little; but some day the piecing together of dissociated knowledge will open up such terrifying vistas of reality, and of our frightful position therein, that we shall either go mad from the revelation or flee from the deadly light into the peace and safety of a new dark age.
-H.P. Lovecraft
Sounds like Caveman Science Fiction to me. “Why should we risk learning about new things, when there’s a possibility they’ll be scary?”
This belongs on the parody site http://morewrong.com. Please build it :-)
I don’t know; the more Less Wrong I read, the more I start to think Lovecraft was on to something.
Delving too far in our search for knowledge is likely to awaken vast godlike forces which are neither benevolent nor malevolent but horrifyingly indifferent to humanity. Some of these forces may be slightly better or worse than others, but all of them could and would swat our civilization away like a mosquito. Such forces may already control other star systems.
The only defense against such abominations is to study the arcane knowledge involved in summoning or banishing these entities; however, such knowledge is likely to cause its students permanent psychological damage or doom them to eternities of torture.
We’ve got Harry Potter and the Methods of Rationality; maybe you should write Cthulhu Mythos and Rationality?
Then again, it might be unwise to disseminate it openly.
At the Mountains of Sanity
I’ve always enjoyed Vernor Vinge’s name for AI: “Applied Theology”.
(In, I think, A Fire Upon the Deep.)
“Theological engineering” has a nice ring to it.
I never read Lovecraft as being any kind of metaphor for the real world, so I wouldn’t vote this up as a rationalist quote for that reason.
But I like it as a device Lovecraft used to convey the sheer magnitude of horror. Can you imagine discovering something so horrific you wished you could delete the whole thing from your memory? The more you pride yourself as a rationalist, the more horrific it would have to be.
This seems to be the premise of Isaac Asimov’s “Nightfall”.