In the same way that smart Christians have a limited window to become atheists before they irrecoverably twist their minds into an Escher painting justifying their theological beliefs, I propose that people have a limited window to see the danger behind bash+GPT-100 before they become progressively more likely to make up some pseudophilosophical argument that AI alignment is an ill-posed question, and thus they're not gonna get eaten by nanobots.