I’m talking about everyday situations. Like “if I push on this door, it will open”, or “by next week my laundry hamper will be full”, or “it’s probably going to be colder in January than June”. Even with quantum mechanics, people do figure out the patterns and build some intuition, but only after seeing a lot of data, and most people never study it enough to see that much data.
In places where the humans in question don’t have much first-hand experiential data, or where the data is mostly noise, that’s where human prediction tends to fail. (And those are also the cases where we expect learning systems in general to fail most often, and where we expect the system’s priors to matter most.) Another way to put it: humans’ priors aren’t great, but in most day-to-day prediction problems we have more than enough data to make up for that.
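That last point — enough data making up for a mediocre prior — is just the standard Bayesian story of the likelihood swamping the prior. Here’s a minimal sketch (my own toy illustration, not from the original discussion): two agents with very different Beta priors on a coin’s heads probability, updating on the same large sample, end up with nearly identical posterior estimates.

```python
import random

random.seed(0)

def posterior_mean(prior_a, prior_b, flips):
    """Posterior mean for a Beta(a, b) prior after observing Bernoulli flips.

    Beta is conjugate to the Bernoulli, so the posterior is just
    Beta(a + heads, b + tails), whose mean is (a + heads) / (a + b + n).
    """
    heads = sum(flips)
    n = len(flips)
    return (prior_a + heads) / (prior_a + prior_b + n)

# Simulated everyday data stream: 10,000 observations of a 70%-heads coin.
p_true = 0.7
flips = [1 if random.random() < p_true else 0 for _ in range(10_000)]

# One agent with a decent prior (mean 0.7), one with a badly wrong
# prior (mean 0.01). After 10,000 observations, both posteriors sit
# close to the true 0.7 — the data has washed the priors out.
good_prior_estimate = posterior_mean(7, 3, flips)
bad_prior_estimate = posterior_mean(1, 99, flips)

print(good_prior_estimate, bad_prior_estimate)
```

With only a handful of flips, the two estimates would diverge sharply — which is exactly the low-data regime where the paragraph above expects prediction (human or otherwise) to fail and priors to dominate.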