I’m discussing an agent that does in fact take the 5, but which imagines taking the 10 instead. There have been some discussions of decision theory using proof-based agents and how they can run into spurious counterfactuals. If you’re confused, you can try searching the archive of this website. I tried earlier today, but couldn’t find particularly good resources to recommend. I couldn’t find a good resource on playing chicken with the universe either.
(I may write a proper article at some point in the future to explain these concepts if I can’t find an article that explains them well)
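In the meantime, here is a minimal Python sketch of the spurious-counterfactual structure in the 5-and-10 problem (my own toy, not taken from any of the discussions above). The `provable` function is a hand-written stand-in for real proof search over the agent’s source code; its hardcoded answers are meant to mirror the two implications a sound but Löbian prover could actually find about this agent, including the spurious one.

```python
# Toy 5-and-10 agent. A real proof-based agent would enumerate proofs in
# some formal system; here `provable` is a hypothetical oracle whose
# hardcoded answers mimic what such a search could find.

def provable(claim: str) -> bool:
    # "A() == 5 -> U == 5" is genuinely true: taking the 5 yields 5.
    # "A() == 10 -> U == 0" is the spurious counterfactual: it is provable
    # only because "A() == 10" is itself refutable for this agent, which
    # makes the implication vacuously true.
    return claim in {
        "A() == 5 -> U == 5",
        "A() == 10 -> U == 0",
    }

def agent() -> int:
    # For each action, look for a proven guarantee about the resulting
    # utility, then take the action with the best proven utility.
    proven = {}
    for action in (5, 10):
        for utility in (10, 5, 0):
            if provable(f"A() == {action} -> U == {utility}"):
                proven[action] = utility
                break
    return max(proven, key=proven.get)

print(agent())  # prints 5
```

Note the self-fulfilling structure: the spurious proof that taking the 10 yields 0 is only sound because the agent, upon finding it, takes the 5. Playing chicken with the universe is a rule designed to break exactly this loop: if the agent ever proves it will take a particular action, it takes a different one, so no such proof can exist in a sound system.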
Ah, I missed that. That seems like a mental quirk rather than anything fundamental. Then again, maybe you mean something else by it.