I’ve been thinking about non-AI catastrophic risks.
One that I’ve not seen discussed is the idea of cancerous ideas: ideas that spread throughout a population and crowd out other ideas in the competition for attention and resources.
This could lead to civilisational collapse because basic functions stop being performed.
Possible safeguards are partitioning the idea space, and some form of immune system that targets ideas that spread uncontrollably. A rough toy sketch of that dynamic is below.
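To make the dynamic concrete, here is a minimal toy simulation. Everything in it (the partition counts, spread rates, and the immune rule) is a made-up illustration for this post, not anything taken from the dissertation: ideas spread by contact, the cancerous one spreads much faster, partitions keep most contact local, and a crude immune rule pushes back when one idea's share grows too quickly.

```python
# Toy sketch (all names and parameters are hypothetical): a population split into
# partitions, each person holding one idea. A "cancerous" idea copies itself
# aggressively; a crude immune rule suppresses any idea whose share within a
# partition grows too fast in one step.
import random
from collections import Counter

random.seed(0)

N_PARTITIONS = 4              # partitioned idea space: contact mostly stays local
PEOPLE_PER_PARTITION = 250
CROSS_PARTITION_RATE = 0.02   # how often contact crosses a partition boundary
CANCER_SPREAD = 0.9           # adoption probability for the cancerous idea
NORMAL_SPREAD = 0.1           # adoption probability for ordinary ideas
IMMUNE_GROWTH_LIMIT = 0.15    # max share growth per step before suppression kicks in
STEPS = 60

# each partition starts with a mix of "basic function" ideas; one person seeds the cancer
partitions = [["idea_%d" % (i % 5) for i in range(PEOPLE_PER_PARTITION)]
              for _ in range(N_PARTITIONS)]
partitions[0][0] = "cancer"

def share(partition, idea):
    return sum(1 for x in partition if x == idea) / len(partition)

for step in range(STEPS):
    prev_shares = [share(p, "cancer") for p in partitions]
    for partition in partitions:
        for person in range(len(partition)):
            # pick a contact, usually from inside the same partition
            if random.random() < CROSS_PARTITION_RATE:
                other = partitions[random.randrange(N_PARTITIONS)]
            else:
                other = partition
            contact_idea = random.choice(other)
            p_adopt = CANCER_SPREAD if contact_idea == "cancer" else NORMAL_SPREAD
            if random.random() < p_adopt:
                partition[person] = contact_idea
    # crude immune system: if the cancerous idea's share jumps too fast in a
    # partition, revert some of its adopters back to an ordinary idea
    for pi, partition in enumerate(partitions):
        growth = share(partition, "cancer") - prev_shares[pi]
        if growth > IMMUNE_GROWTH_LIMIT:
            for person in range(len(partition)):
                if partition[person] == "cancer" and random.random() < 0.5:
                    partition[person] = "idea_0"

for pi, partition in enumerate(partitions):
    print("partition", pi, Counter(partition).most_common(3))
```

With the immune rule switched off (set IMMUNE_GROWTH_LIMIT above 1.0), the cancerous idea typically takes over every partition; with it on, the spread stalls and the ordinary ideas survive, which is the intuition behind both safeguards.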
Unearthing my old dissertation. I still think there is something to it:
https://docs.google.com/document/d/1-lmOXSfUXYvbhlFcs04VAzl-mKB8ZJfR/edit?usp=drivesdk&ouid=113969196762487274190&rtpof=true&sd=true