I find it quite hard to believe you couldn’t do even better if you were a single mind perceiving what the ants did and controlling them (which is how you are set up in this game).
...
If the separate agents all have simultaneous access to the same information … then they cease being separate agents … .
There’s a big difference between separate agents all running in one brain (e.g., possibly humans) and separate agents in separate brains (ants).
I believe there is a significantly false assumption here: that the agents present in human minds are operating with “simultaneous” (or otherwise) access to “the same information”.
To me, that reads as if lavalamp doesn’t think humans actually are a “unified mind”, though. It’s the program written in the context of the game that acts as a single agent by processing the same information with pseudo-‘simultaneity’.
I believe I understand what you are saying here. I just don’t think it fairly describes what lavalamp was saying.
My reading of that passage is that his assertion was that the separate agents running “in one brain” in humans cease being separate agents as a result of “having simultaneous access to the same information”.
EDIT: Okay, now I find myself confused. From the course of the dialogue it’s clear that pedanterrific did not downvote my comment, so someone who didn’t reply to it must have. I am left without insight as to why this was done, however.
Well, if you’re correct and that is what lavalamp is asserting, I pretty much agree with you. Humans are definitely not “unified minds”, and the difference between separate agents running on one or multiple brains may be large, but it’s quantitative, not qualitative.
That is, even separate agents running on one brain will never have simultaneous access to the same information (unless you cheat by pausing time).
Even then, it’s important to note that agents operating on different principles for transforming and relating to information may only be capable of registering specific subsets of “the same information”, and this is, I believe, contextually relevant to comparing brains to ant colonies. Just as the parts of your brain that handle emotions are not involved in processing the difference between two sounds, two ants in different locations each have access to separate subsets of information, which is then relayed to other parts of the colony.
The emotion-parts react to the signals sent by the sound-parts, and vice versa; so too does ant(1) react to the signals sent by ant(2), and vice versa.
I’m not so sure our anticipations necessarily differ. I think separate agents with amazingly fast communication will approach the performance of a unified mind, and a mind with poor internal communication will approach the performance of separate agents. Human minds arguably have poor internal communication, but I’m still betting it’s more than one order of magnitude better than what ants manage. I think our disagreement is about the scale of this difference more than anything else.
The fundamental barrier to communication inside a single mind is the speed of light; an electronic brain the size of a human one ought to be able to give its sub-agents information that’s pretty damn close to simultaneous.
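For a rough sense of scale, here is a back-of-envelope comparison (every speed and distance in it is an assumed round number for illustration, not a measurement):

```python
# Back-of-envelope latency comparison; all figures are assumptions.

LIGHT_SPEED = 3.0e8   # m/s
AXON_SPEED = 100.0    # m/s, fast myelinated axons (slow fibers: ~1 m/s)
ANT_SPEED = 0.05      # m/s, assumed walking speed of a scout ant

BRAIN_SIZE = 0.15     # m, roughly the width of a human brain
TRAIL_LENGTH = 10.0   # m, assumed scout-to-nest distance

electronic = BRAIN_SIZE / LIGHT_SPEED  # ~5e-10 s: half a nanosecond
neural = BRAIN_SIZE / AXON_SPEED       # ~1.5e-3 s: about a millisecond
ant_relay = TRAIL_LENGTH / ANT_SPEED   # ~2e+2 s: minutes of walking

print(f"electronic brain: {electronic:.1e} s")
print(f"biological brain: {neural:.1e} s")
print(f"ant colony:       {ant_relay:.1e} s")
```

On those guesses, neurons beat ant-to-nest relay by roughly five orders of magnitude, and light-speed electronics beats neurons by another six; a quantitative gap at each step, not a qualitative one.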
At any rate, in this game we do have simultaneous knowledge, and there’s no reason to handicap ourselves by, e.g., waiting for scouts to return to other ants to share their knowledge.
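To sketch what not handicapping ourselves could look like (the class and methods below are hypothetical stand-ins, not the game’s actual starter-kit API): pool what every ant sees into one shared map each turn, and issue every order from that single world-model.

```python
# Hypothetical sketch (not the game's real API): a single "mind" that
# pools what every ant sees each turn and issues all orders centrally.
from typing import Dict, List, Set, Tuple

Loc = Tuple[int, int]  # (row, col) on the game map


class SharedMind:
    """One agent perceiving through every ant at once."""

    def __init__(self) -> None:
        self.known_food: Set[Loc] = set()

    def observe(self, visible_food: List[Loc]) -> None:
        # Everything any ant can see goes straight into the shared map;
        # no scout ever has to walk back to "tell" the others.
        self.known_food.update(visible_food)

    def orders(self, my_ants: List[Loc]) -> Dict[Loc, Loc]:
        # Greedily send each ant toward the nearest unclaimed food,
        # chosen from the pooled map rather than that ant's own sight.
        assignments: Dict[Loc, Loc] = {}
        unclaimed = set(self.known_food)
        for ant in my_ants:
            if not unclaimed:
                break
            nearest = min(
                unclaimed,
                key=lambda f: abs(f[0] - ant[0]) + abs(f[1] - ant[1]),
            )
            assignments[ant] = nearest
            unclaimed.discard(nearest)
        return assignments


if __name__ == "__main__":
    mind = SharedMind()
    mind.observe([(1, 2), (6, 6)])  # food spotted by two different ants
    print(mind.orders([(0, 0), (5, 5)]))  # {(0, 0): (1, 2), (5, 5): (6, 6)}
```

The greedy nearest-food assignment is only a placeholder; the point is that every decision draws on the colony-wide map rather than on any one ant’s knowledge.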
I’m not the one downvoting you, either.