Mmm… does “you” mean a person, or does “you” mean anything? Catching a ball can easily be done without predicting its final location, as was discussed in a different thread.
That depends on what you mean by “predict”. I don’t mean a conscious prediction, I just mean a model that tells you how to get there. Even if that model is an algorithm, it’s still a prediction.
Consider the ball player who runs to catch the ball, and then realizes he’s not going to make it and stops trying. How is that not a prediction?
> I just mean a model that tells you how to get there.
Oh, okay. I misunderstood what you meant.
> Consider the ball player who runs to catch the ball, and then realizes he’s not going to make it and stops trying. How is that not a prediction?
That has little to do with what I was talking about. Something that “predicts” by thinking “If I am not holding the ball, move closer” has no concept of being able to “make it” to the landing spot. It couldn’t care less where the ball ends up. All it needs to know is if it is currently holding the ball and how to get closer. The “how to get closer” is the predictor.
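The “move closer” policy described above can be made concrete. Here is a minimal sketch (all names and the 1-D setup are illustrative, not from the thread): the agent only reads the ball’s current apparent position and steps toward it, with no trajectory model and no estimate of where the ball will land.

```python
# A purely local "move closer" policy: no landing-spot prediction,
# just "where does the ball appear to be right now, and how do I get closer?"

def step_toward(player_x, ball_x, speed=1.0):
    """Move the player one step toward the ball's current apparent position."""
    if ball_x > player_x:
        return player_x + min(speed, ball_x - player_x)
    if ball_x < player_x:
        return player_x - min(speed, player_x - ball_x)
    return player_x  # already directly under the ball

# One simulated chase in 1-D: the player tracks the ball's current x
# at each instant, never computing where the ball will end up.
player = 0.0
for ball in [10.0, 9.0, 8.5, 8.25]:  # the ball's apparent x over time
    player = step_toward(player, ball)
```

Note that the controller is stateless between steps: the only “model” it carries is the local rule for closing the gap, which is the sense in which the rule itself can be called the predictor.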
> That has little to do with what I was talking about. Something that “predicts” by thinking “If I am not holding the ball, move closer” has no concept of being able to “make it” to the landing spot. It couldn’t care less where the ball ends up. All it needs to know is if it is currently holding the ball and how to get closer. The “how to get closer” is the predictor.
As I said, I understand you can make a control system that works that way. I’m just saying that humans don’t appear to work that way, and possibly cortically-driven behaviors in general (across different species) don’t work that way either.
Edit to add: see also the Memory-prediction framework article on Wikipedia for more on feed-forward predictive modeling in the neocortex, e.g.:
> The central concept of the memory-prediction framework is that bottom-up inputs are matched in a hierarchy of recognition, and evoke a series of top-down expectations encoded as potentiations. These expectations interact with the bottom-up signals to both analyse those inputs and generate predictions of subsequent expected inputs.
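The predict/compare loop the quoted passage describes can be sketched in a few lines. This is only an illustration of the general idea, not the framework’s actual mechanism; the function name, the scalar signal, and the update rate are all made up for the example. A higher level sends down an expected next input, the mismatch with the actual bottom-up input is the prediction error, and that error is what updates the expectation.

```python
# Toy predict/compare loop: top-down expectation vs. bottom-up input.

def update_expectation(expected, actual, rate=0.5):
    """Nudge the top-down expectation toward the observed bottom-up input."""
    error = actual - expected       # prediction error (the mismatch)
    return expected + rate * error  # revised expectation for the next input

expectation = 0.0
for observation in [1.0, 1.0, 1.0]:  # a steady incoming signal
    expectation = update_expectation(expectation, observation)
# As repeated inputs keep matching, the expectation converges on the signal
# and the prediction error shrinks toward zero.
```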
> I’m just saying that humans don’t appear to work that way, and possibly cortically-driven behaviors in general (across different species) don’t work that way either.
Yeah, this makes sense and that is why I asked the question about who “you” was.