Here’s a dumb question… In the version of this paradox where some agent can perfectly predict the future, why is it meaningful or useful to talk about “decisions” one might make?
Because the predicting agent is itself following predictable rules. It's more like working with a force of nature than with an entity we'd intuitively think of as having free will.