It seems like humans get a lot of use out of concepts like “agent” and “extortion” even though, in principle, functional decision theory is simpler. Functional decision theory may simply never be computationally tractable outside of radically simplified toy problems.