I would likely infer that thoughts had been exchanged between them, but I wouldn’t be confident that those thoughts could be translated into a form I could understand.
Alternative explanations include:
They exchanged genetic material, like bacteria, or outright code, like computer programs, which made them behave more similarly.
They are programs; one attacked the other, killed it, and replaced its computational slot with a copy of itself.
A1 gave A2 a copy of its black-box decision maker, which both now use to determine their behavior in this situation (a toy sketch of this scenario follows the list). However, neither of them understands the black box’s decision algorithm on the level of their own conscious thoughts, and the black box itself is not sentient or alive and has no thoughts.
One of them observed the other was more efficient and is now emulating its behavior, but they didn’t talk about it (“exchange thoughts”), just looked at one another.
These are, of course, not exhaustive.
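For concreteness, here is a minimal sketch of the black-box scenario as a toy program (every name in it is invented for illustration; nothing here comes from a real system). The point it makes: two agents can come to behave identically by sharing an opaque decision procedure, without any articulable thought passing between them.

```python
from typing import Callable

def make_black_box() -> Callable[[str], str]:
    """Build an opaque decision maker; callers get behavior, not reasons."""
    weights = {"threat": 0.9, "food": 0.4}  # internal state no agent can see
    def decide(situation: str) -> str:
        return "flee" if weights.get(situation, 0.0) > 0.5 else "approach"
    return decide

class Agent:
    def __init__(self, name: str, decision_maker: Callable[[str], str]):
        self.name = name
        self.decision_maker = decision_maker

    def act(self, situation: str) -> str:
        # The agent *executes* the computation but cannot introspect it.
        return self.decision_maker(situation)

a1 = Agent("A1", make_black_box())
a2 = Agent("A2", lambda s: "approach")  # A2's original, different policy
a2.decision_maker = a1.decision_maker   # the hand-off: a copy of the black box

# Identical behavior results, though no articulable "thoughts" were exchanged.
assert a1.act("threat") == a2.act("threat") == "flee"
```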
You could call some of these cases a kind of thought. Maybe to self-modifying programs, a black-box executable algorithm counts as a thought; or maybe to beings who use the same information storage for genes and minds, lateral gene transfer counts as a thought.
But this is really just a matter of defining what the word “thought” may refer to. I could define it to include undocumented executable Turing machines, which I don’t think humans like us can “think”. Or you could define it as something that, after careful argument, reduces to “whatever humans can think and no more”.
Sure. Leaving aside what we properly attach the label “thought” to, the thing I’m talking about in this context is, roughly speaking, the executed computations that motivate behavior. In that sense I would accept many of these options as examples of the thing I was talking about, although the second option (one program killing and replacing the other) is primarily something else, and thus somewhat misleading to talk about that way.