What is ‘knowing’? This is not an arbitrary question. They say that curiosity is wanting to know. But making new considerations makes you more efficient, so someone who wants to become more efficient will therefore look for new considerations and behave as if curious.
Bayesians appear to perform well epistemically, allegedly because they feel a desire to know; because they feel curiosity. But I expect Bayesians would perform just as well if not better if they desired to be efficient, which, again, will look approximately identical to curiosity.
Another type of reasoner may desire to get closer to omniscience. Omniscience means having awareness of all information.
If you admit you are not omniscient about a question, then you admit there is more information that could shift your belief, which would make you more efficient.
Yet those who allegedly desire to know often claim to know the correct answers to questions even when they are not omniscient about those questions. What could they mean by ‘know’? Are they wrong that they know? Do they have sufficient semiscience in some sense? How could sufficient-semiscience-for-knowledge be defined in a way that can’t be gamed?
I suspect there is no such ungameable definition. If there is no ungameable definition, then it would seem ‘desire to know’ is also gameable if it does not mean ‘desire to consider everything’ (become omniscient).