you have now managed to divert this conversation into a discussion on philosophy of language,
The discussion is about the Chinese Room (CR), the CR is about semantics, and philosophy of language is relevant to semantics, so I don’t see the diversion.
None of that applies to virtual reality, which was the part of the article you originally took umbrage at.
I don’t see why not. VR isn’t a place where things exist; rather, it relies on representation and intentionality as much as a picture or novel does.
What I didn’t say before was that VR is particularly problematic in the case of the CR, because a CR that is grounding its symbols in VR could reconfigure itself so that the VR is run internally... you then have the paradox of a CR with supposedly grounded symbols, but no contact with the outside world.
What is the referent of “Protoss Carrier”, if not “computer-generated video game model of the Protoss Carrier”?
As I stated before, I don’t think there has to be one.
That gives me a way of cashing out the fact/fiction distinction. I don’t know what your alternative is, because you haven’t given one.
As before, you seem to be arguing from a tacit assumption that all terms must have referents.
Irrelevant....irrelevant.
I was arguing against the all-must-have-referents theory, and inasmuch as you are still using it, it is still relevant.
you can easily tell from context besides.
I was tacitly using the assumption, common in analytical philosophy, that theories should be built on cleaned-up versions of natural language... hence, “normatively”. How do you build theories, dxu?
Stick to the topic at hand, please.
So what do you think of the CR, dxu?
Something which does not exist in physical reality except by being instantiated in some virtual representation or map, and is also claimed to exist in physical reality.
I didn’t say that. Things don’t, properly speaking, exist in virtual reality; they are represented in it. Which adds nothing to neural representations.
That being the case, I am saying a fictional term...
1. ...has no referent...
2. ...and is not intended to.
But these are not different theories. 2 is just a ramification of 1, a second iteration.
I didn’t say that. Things don’t, properly speaking, exist in virtual reality; they are represented in it.
Right, and terms for objects in virtual reality can refer to those representations. That’s the same thing I was getting at with my distinction involving Santa and neural patterns, except in this case, there’s no claim that the VR object exists in the physical world, and thus nothing to get confused about. Hence, your objection to Santa does not apply here.
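A toy sketch of that point (all names here are hypothetical, just for illustration): on this view, the referent of “Protoss Carrier” is a concrete data structure in the game engine’s memory, and the term can refer to that representation without any claim that a corresponding object exists in the physical world.

```python
# Minimal sketch with hypothetical names: a VR/game object is just a
# data structure held in memory. "Protoss Carrier" can refer to this
# representation itself -- no physical-world referent is needed or claimed.
from dataclasses import dataclass

@dataclass
class UnitModel:
    name: str        # the in-game label players use
    mesh_id: int     # handle to the rendered 3D model
    hit_points: int  # gameplay state, part of the representation

# The referent, on this view: an inspectable object in the engine's memory.
carrier = UnitModel(name="Protoss Carrier", mesh_id=42, hit_points=300)
print(carrier.name)  # the representation exists; no physical carrier does
```

The point of the sketch is only that the representation is a perfectly ordinary object one can refer to, which is why no confusion of the Santa kind arises.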
The linked article doesn’t really demonstrate that. In particular, if you are going to appeal to robot bodies as giving a level of causal connection sufficient to ground symbols, then Searle still has a point about the limitations of abstract, unembodied software.
And how is the “all-must-have-referents theory” relevant to Stuart Armstrong’s original example which you first took issue with?
I’m bringing it up because you are. It’s like you’re saying it’s OK for you to appeal to unjustified premises, but if I bring it up, I’m at fault for changing the subject.
I build theories by observing how real humans communicate.
If that means taking their statements at face value, without allowing for metaphor or misleading phraseology... then I have to tell you that there is a thing called a cold that someone can catch, and a thing called a temper someone can lose.
Right, and terms for objects in virtual reality can refer to those representations
Is that a fact? In particular, is it a fact that people are referring in my technical sense of “referring”, and not in some loose and popular sense, e.g. talking about?