Not a very technical objection, but I have to say: simulating the demon Azazel, who wants to maximize paperclips but is good at predicting text because he's a clever, hardworking strategist, doesn't feel very simple to me at all. It seems like a program that just predicts text would almost have to be simpler than a simulation of a genius mind with some other goal, one that cleverly chooses to predict text for instrumental reasons.