I think that if we fully understood the algorithm and had chunked it in our heads, so that we could just imagine manipulating it any way we liked, we would view it as less agenty. But of course, a lot of our intuitions are rough heuristics, and they might misfire in various ways and make us think “agent!” in a way we don’t reflectively endorse (just as we don’t endorse “a smiley face → a person”).
Or, you know, my attempted abstraction of my agent intuitions fails in some way. I think stochasticity might play a part in something seeming agenty. Maybe it’s because most agents are power-seeking, and power-seeking behaviour is about leaving lots of options live, which increases others’ uncertainty about your future actions. Wasn’t there a paper about entropy that someone linked in the comments of one of TurnTrout’s “Reframing Impact” posts? It modeled entropy in a way that shared mathematical structure with impact measures. Of course, there’s also some kind of logical uncertainty when you model an agent modeling you.
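To make the “leaving options live” intuition a bit more concrete, here’s a toy sketch (my own construction, not from that paper): treat a policy’s optionality as the Shannon entropy of its distribution over next states. The `future_state_entropy` helper below is purely illustrative, not anything from the impact-measures literature.

```python
import numpy as np

def future_state_entropy(transition_probs):
    """Shannon entropy (in bits) of a distribution over next states.

    Higher entropy = more options kept live = harder for an observer
    to predict what you'll do next.
    """
    p = np.asarray(transition_probs, dtype=float)
    p = p[p > 0]  # drop unreachable states; 0 * log(0) is taken as 0
    return float(-(p * np.log2(p)).sum())

# An option-preserving policy keeps four next states equally live...
print(future_state_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits
# ...while a committed policy has mostly burned its optionality.
print(future_state_entropy([0.97, 0.01, 0.01, 0.01]))  # ~0.24 bits
```

On this toy measure, the “power-seeking” policy looks maximally unpredictable to an observer, which matches the intuition that optionality and others’ uncertainty go together.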
As for the example of dancing, CGI, and music, I’d say that’s more about “natural/human” vs. “unnatural/inhuman” than “agent” vs. “not-agent”, though there’s a large inner product between the two axes.