I don’t know; the more Less Wrong I read, the more I start to think Lovecraft was on to something.
Delving too far in our search for knowledge is likely to awaken vast godlike forces which are neither benevolent nor malevolent but horrifyingly indifferent to humanity. Some of these forces may be slightly better or worse than others, but all of them could and would swat our civilization away like a mosquito. Such forces may already control other star systems.
The only defense against such abominations is to study the arcane knowledge involved in summoning or banishing these entities; however, such knowledge is likely to cause its students permanent psychological damage or doom them to eternities of torture.
We’ve got Harry Potter and the Methods of Rationality; maybe you should write Cthulhu Mythos and Rationality?
Then again, it might be unwise to disseminate it openly.
At the Mountains of Sanity
I’ve always enjoyed Vernor Vinge’s name for AI: “Applied Theology”.
(In, I think, A Fire upon the Deep.)
“Theological engineering” has a nice ring to it.