You say that people should not be allowed to have their preferences, but that they should just have their beliefs (e.g., to have preferences is to have a larger utility).
But there are some important questions which I think are not answered here:
Does this apply to a human brain [1], or to any other species [2] that we aren’t part of? [...]
[...]
In what sense do our decisions make sense if we don’t have a conscious mind?
The problem of having a conscious mind seems like the single most useful thing. The whole “be a conscious being” aspect seems very useful compared to the huge gap between conscious and unconscious minds, which otherwise seems to be something like how the brain doesn’t have a conscious mind but is pretty far off from it.
Of course, you could also try other approaches: your mind could be a computer or a CPU, or something different entirely.
I suggest that maybe having one mental brain that does the opposite of something is more useful than one mental brain that “does both things”.
I mean, yeah, agents (like everyone) benefit from social skills.
Too real, GPT2, too real.
This seems relevant to Qiaochu’s interests.
I can’t think of anything else that is more relevant to Qiaochu and Qiaochu.