Is there a difference between “x is y” and “assuming that x is y generates more accurate predictions than the alternatives”? What else would “is” mean?
Are you saying the model with the currently-best predictive ability is reality??
Not quite—rather the everyday usage of “real” refers to the model with the currently-best predictive ability. http://lesswrong.com/lw/on/reductionism/ - we would all say “the aeroplane wings are real”.
rather the everyday usage of “real” refers to the model with the currently-best predictive ability
Errr… no? I don’t think this is true. I’m guessing that you want to point out that we don’t have direct access to the territory and that maps are all we have, but that’s not very relevant to the original issue of replacing “I find it convenient to think of that code as wanting something” with “this code wants” and insisting that the code’s desires are real.
Notice the difference (emphasis mine): “I find it convenient to think of that code as *wanting* something”
vs
“this code *wants*”.
Well, the fundamental problem is that LW-style qualia-free rationalism has no way to define what the word “want” means.
Anthropomorphization is not the way to reality.