Yes, but expecting any reasoner to develop well-grounded abstract concepts without any grounding in features and then care about them is… well, it’s not actually complete bullshit, but expecting it to actually happen relies on solving some problems I haven’t seen solved.
You could, hypothetically, just program your AI to infer “goodness” as a causal-role concept from the vast sums of data it gains about the real world and our human opinions of it, and then have it “maximize goodness”, formulated as another causal role. But this requires sophisticated machinery for dealing with causal-role concepts, which I haven’t seen developed to that extent in any literature yet.
Usually, reasoners develop causal-role concepts in order to explain what their feature-level concepts are doing, and thus, causal-role concepts abstracted over concepts that don’t eventually root themselves in features are usually dismissed as useless metaphysical speculation, or at least abstract wankery one doesn’t care about.
I don’t think you are responding to the correct comment. Or at least I have no idea what you are talking about.