Finally, the electron is found with some definite polarisation. You just don’t know which one before actually doing the experiment (same as for the coin), and, according to the present model of physics (don’t forget that non-local hidden variables are not ruled out), you cannot in principle make any observation that tells you the result with more certainty in advance, whereas for the coin you can. So the difference is that the future of a classical system can be predicted with unlimited certainty from its present state, while for a quantum system it cannot. This doesn’t necessarily mean that the future is not determined. One can adopt the viewpoint (I think it was even suggested on OB/LW in Eliezer’s posts about timeless physics) that the future is symmetric to the past: it exists in the whole history of the universe, and if we don’t know it now, that is our ignorance. I suppose you would agree that not knowing the electron’s past is a matter of our ignorance rather than a property of the electron itself, regardless of whether we are able to calculate it from presently available information, even in principle (i.e. using present theories).
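For concreteness, here is a toy sketch of that asymmetry (my own illustration in Python; the functions and numbers are made up, not part of the original discussion): the classical trajectory follows exactly from the present state, while for the polarisation measurement only Born-rule probabilities are available in advance.

```python
import numpy as np

# Toy contrast: classical prediction is a single certain value,
# quantum prediction is only a probability distribution over outcomes.

def classical_prediction(position, velocity, t):
    """Free particle: the future state follows from the present one with certainty."""
    return position + velocity * t

def born_probabilities(state, analyzer_angle):
    """Probabilities of passing / not passing an analyser set at analyzer_angle,
    for a two-component polarisation (or spin) amplitude vector."""
    passed = np.array([np.cos(analyzer_angle), np.sin(analyzer_angle)])
    p_pass = abs(np.vdot(passed, state)) ** 2
    return p_pass, 1.0 - p_pass

print(classical_prediction(0.0, 2.0, t=3.0))                 # exactly 6.0, no residual uncertainty
print(born_probabilities(np.array([1.0, 0.0]), np.pi / 6))   # (0.75, 0.25), irreducibly probabilistic
```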
I also think there is little merit in engaging in discussions about terminology, and this one tends in that direction. Practically, there is no difference between saying that quantum probabilities are “properties of the system” or “of the predictor”. Either we can predict or we can’t, and that’s all that matters. Beware of the clause “in principle”, as it often only obscures the debate.
Edit: to put it a little differently, predictability is an instance of regularity in the universe, i.e. our ability to compress the data of the whole history of the universe into some brief set of laws and a possibly not so brief set of initial conditions, which together are still a much smaller amount of information than the history of the universe recorded at every point and time instant. Since we do not have this huge pack of information and thus can’t say to what extent it is compressible, we use theories that rest heavily on induction, which is itself a particular bias. We don’t even know whether the theories we use apply at any time and place, or to any system, universally. Frequentists seem to distinguish this uncertainty (which they largely ignore in practice) from uncertainty as a property of the system. So, as I understand the state of affairs, a frequentist is satisfied with a theory (a compression algorithm applicable to the information about the universe) that on some occasions calls a random number generator (e.g. when dealing with dice or electrons), and the uncertainty so induced he calls a “property of the system”. The uncertainty about the theory itself, on the other hand, is a different kind of “meta-uncertainty”.
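A toy way to picture that frequentist move (my own sketch in Python; the names and numbers are invented for illustration): the “theory” reproduces the classical part of the record from a short law, and simply plugs in a random number generator wherever the quantum outcomes appear, then declares the residual randomness a property of the system.

```python
import random

# Hypothetical "theory as compression algorithm": a short law regenerates the
# classical data exactly; for the quantum outcomes it cannot compress further,
# it just calls an RNG.

def deterministic_law(step):
    """Brief law + initial condition: reproduces the classical record with certainty."""
    return 3 * step + 1          # a made-up regularity standing in for "the laws"

def quantum_outcome(p_up=0.5):
    """No shorter description available: call an RNG and label the resulting
    uncertainty a 'property of the system'."""
    return "up" if random.random() < p_up else "down"

history = [(deterministic_law(t), quantum_outcome()) for t in range(5)]
print(history)   # first component predicted exactly, second only statistically
```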
The Bayesian approach seems to me more elegant (and Occam’s-razor friendly), as it doesn’t introduce different sorts of uncertainty. It also fits better with the view of physical laws as compression algorithms, since it doesn’t distinguish between data and theories with regard to their uncertainty. One may simply accept that the history of the universe needn’t be compressible to the data available at the moment, and use induction to estimate future states of the world in the same way one estimates the limits of validity of presently formulated physical laws.
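A minimal sketch of what treating both kinds of uncertainty uniformly could look like (my own illustration; the two candidate “theories” and their credences are invented): the prediction is a single probability, obtained by averaging the outcome probability each theory assigns, weighted by the credence the data gives that theory. No separate “meta-uncertainty” bookkeeping is needed.

```python
# Hypothetical example: uncertainty over theories and uncertainty within a
# theory enter the prediction the same way, through one expectation.

# P(outcome = "up") according to each candidate theory.
theories = {
    "biased_up": 0.9,
    "symmetric": 0.5,
}

# Credence in each theory given everything observed so far (numbers made up).
posterior = {
    "biased_up": 0.3,
    "symmetric": 0.7,
}

# Posterior predictive: P(up) = sum over T of P(up | T) * P(T | data).
p_up = sum(theories[t] * posterior[t] for t in theories)
print(p_up)   # 0.9*0.3 + 0.5*0.7 = 0.62
```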