I can’t test this right now, but I wonder if part of the problem is that you’re prompting it to have fun and (implicitly) to tell you a story, not to do logical thinking. I wonder if an “I was reading about this historical thing...” prompt, or a fully modern prompt, would help — though if you make it too realistic, ChatGPT’s “don’t tell users to kill people” behavior will take over.