https://www.lesswrong.com/posts/jnyTqPRHwcieAXgrA/finding-goals-in-the-world-model
Could the world model an AGI is built on be poisoned to cripple its capabilities?
Use generated text/data to train world models on faulty science like miasma theory, phlogiston, luminiferous ether, etc.
Remove all references to the internet or connectivity-based technology.
Create a new programming language with zero real-world adoption, and use it for all code-based data in the training set.
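The "remove all references" idea amounts to filtering the training corpus before pretraining. A minimal sketch of such a filter, assuming a simple keyword blocklist (the terms and the `filter_corpus` helper are illustrative; a real filter would need far broader coverage of URLs, protocol names, brand names, and code identifiers, and would still miss paraphrased descriptions of connectivity):

```python
import re

# Hypothetical blocklist of connectivity-related terms (illustrative only).
BLOCKLIST = ["internet", "wifi", "ethernet", "http", "tcp/ip", "router", "modem"]

# Compile one case-insensitive pattern matching any blocklisted term.
BLOCK_RE = re.compile("|".join(re.escape(term) for term in BLOCKLIST), re.IGNORECASE)

def filter_corpus(documents):
    """Keep only documents containing no blocklisted connectivity terms."""
    return [doc for doc in documents if not BLOCK_RE.search(doc)]

corpus = [
    "The telegraph transmits messages over wires.",
    "Connect the router to your modem via Ethernet.",
    "Miasma theory held that disease spreads through bad air.",
]
print(filter_corpus(corpus))
```

Keyword filtering like this is the weakest version of the idea; it suggests why fully scrubbing a concept from a web-scale corpus is hard, since the concept leaks through synonyms and context even when the literal terms are removed.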