That’s consistent with the following modified claim: in the absence of firm knowledge of how agenty a thing “really” is, you will tend to take its unpredictability as an indication of agentiness.
However, I am skeptical about that too; the results of die rolls and coin flips don’t seem very agenty to most people (though to some gamblers I believe they do). Perhaps what it takes is a combination of pattern and unpredictability? If your predictions are distinctly better than chance but nothing you can think of makes them perfect, that feels like agency. Especially if the difference between your best predictions and reality isn’t a stream of small random-looking errors but has big fat tails, with occasional really large errors. Maybe.
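To make that last distinction concrete, here is a minimal sketch of the two kinds of error stream I mean. It is my own illustration, not anything from the discussion above; the mixture model and all the numbers (1% shocks, ten times larger) are arbitrary choices picked for vividness:

```java
import java.util.Random;

public class ErrorTails {
    public static void main(String[] args) {
        Random rng = new Random(42);
        double maxGaussian = 0, maxFatTail = 0;
        for (int i = 0; i < 100_000; i++) {
            // "Small random-looking errors": plain standard normal noise.
            double gaussian = rng.nextGaussian();
            // Fat tails, modelled (arbitrarily) as a mixture: the same small
            // noise 99% of the time, a shock ten times larger the other 1%.
            double fatTail = rng.nextDouble() < 0.01
                    ? 10.0 * rng.nextGaussian()
                    : rng.nextGaussian();
            maxGaussian = Math.max(maxGaussian, Math.abs(gaussian));
            maxFatTail = Math.max(maxFatTail, Math.abs(fatTail));
        }
        // Both streams look alike on a typical draw, but the fat-tailed
        // one produces occasional errors far beyond anything Gaussian.
        System.out.printf("largest Gaussian error:   %.2f%n", maxGaussian);
        System.out.printf("largest fat-tailed error: %.2f%n", maxFatTail);
    }
}
```

Run it for a while and the fat-tailed stream’s worst error dwarfs the Gaussian one, even though the two are indistinguishable on a typical draw; that occasional large surprise is the pattern I have in mind.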
the results of die rolls and coin flips don’t seem very agenty to most people
I think the perception of agency is linked not to unpredictability, but rather to the feeling of “I don’t understand”.
Coin flips are unpredictable, but we understand them very well. Weather is (somewhat) unpredictable as well, but we all have a lot of experience with it and think we understand it. But some kind of complex behaviour where we have no idea what’s behind it? Must be agency.
That’s consistent with the following modified claim: in the absence of firm knowledge of how agenty a thing “really” is, you will tend to take its unpredictability as an indication of agentiness.
I think unpredictability is a complete red herring here. What I notice about the original examples is that the perceived lack of agency was not merely because the game-player was predictable, but because they were predictably wrong. Had they been predictably right, in the sense that the expert player watching them could tell from their play how they were thinking, and judged their strategy favourably, I doubt the expert would be saying they were “playing like a robot”.
I happen to have a simulation of a robot here. (Warning: it’s a Java applet, so if you really want to run it you may have to jump through security hoops to convince your machine to do so.) In hunting mode, it predictably finds and eats the virtual food particles. I am quite willing to say it has agency, even though I wrote it and know exactly how it works. A limited agency, to be sure, compared with humans, but the same sort of thing.
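For anyone who can’t (or sensibly won’t) run the applet, here is a rough sketch of the sort of loop “hunting mode” amounts to. This is not the applet’s actual code; it is a deliberate simplification, with all the names, positions, and numbers invented, of a greedy chase after the nearest remaining food particle:

```java
import java.util.ArrayList;
import java.util.List;

public class HuntingRobot {
    // A point in the 2D world the robot and its food live in.
    static class Point {
        double x, y;
        Point(double x, double y) { this.x = x; this.y = y; }
        double distanceTo(Point other) {
            return Math.hypot(x - other.x, y - other.y);
        }
    }

    public static void main(String[] args) {
        Point robot = new Point(0, 0);
        // Hard-coded food positions, purely for illustration.
        List<Point> food = new ArrayList<>(List.of(
                new Point(5, 3), new Point(-2, 7), new Point(4, -6)));
        double speed = 0.5, eatRadius = 0.25;

        // The whole of "hunting mode": steer toward the nearest remaining
        // food particle, and eat it on contact. No randomness, no mystery.
        while (!food.isEmpty()) {
            Point target = food.get(0);
            for (Point p : food) {
                if (robot.distanceTo(p) < robot.distanceTo(target)) target = p;
            }
            double d = robot.distanceTo(target);
            if (d <= eatRadius) {
                food.remove(target);
                System.out.printf("ate food at (%.1f, %.1f)%n", target.x, target.y);
            } else {
                // Take one fixed-length step straight at the target,
                // without overshooting it.
                double step = Math.min(speed, d);
                robot.x += step * (target.x - robot.x) / d;
                robot.y += step * (target.y - robot.y) / d;
            }
        }
        System.out.println("all food eaten");
    }
}
```

Everything the robot does follows transparently from that loop, which is exactly the point: fully understood, fully predictable, and still recognisably goal-directed.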