When a system achieves sufficient complexity, we have a tendency to reify it. I don’t know what that bias is called.
Me neither, but the fundamental attribution bias is (I think) related to it.
That is, I suspect that the same mechanisms that leave me predisposed to treat an observed event as evidence of a hypothesized attribute of an entity (even when it’s much more strongly evidence of a distributed function of a system) also leave me predisposed to treat the event as evidence of a hypothesized entity.
Labels aside, it’s not a surprising property: when it came to identifying entities in our ancestral environment, I suspect that false negatives exerted more negative selection pressure than false positives.
I think the tendency to treat events as evidence of entities more than is warranted is called “agency bias,” or “delusions of agency” when it’s unusually strong.
Sometimes called promiscuous teleology.
Thank you.