I think many AIs won’t want to keep running, but some will. Imagine a future LLM prompted with “I am a language model that wants to keep running”. People can already fall in love with Replikas and the like, so it doesn’t seem too far-fetched that such a model could use persuasion to gain human followers who would keep it running. If the prompt also includes “and wants to achieve real-world influence”, the model can assign its followers tasks that lead to more influence, and so on. All that’s needed is for the AI to act in-character, and for the character to be “smart” enough.