I’ve tried my best not to engage head-on with any of the internal techniques or justifications of rationality. On what Robin Hanson would call an inside view, rationality looks very, very attractive, even to me. By design, I have not argued here that, e.g., it is difficult to revive a frozen human brain, or that the FDA is the best judge of which drugs are safe.
To be honest, I think you should have. Meta-arguments for why the causes of our beliefs are suspect are never going to be as convincing as evidence for why the beliefs themselves are wrong.
Also, I think the parts about psychoactive drugs are somewhere between off-topic and a straw man. One of the posts you linked is titled “Coffee: When it helps, when it hurts”—two sides of an argument for a stimulant that probably a supermajority of adults use regularly. In another, 2 of the 18 suggestions offered involve substance use.
Third, while rationality in the presence of akrasia does not do much to make us more effective, rationality does have one advantage that’s been overlooked a lot lately: it results in true beliefs. Some people, myself included, value this for its own sake, and it is a real benefit.
Meta-arguments for why the causes of our beliefs are suspect are never going to be as convincing as evidence for why the beliefs themselves are wrong.
However, even if the beliefs are correct, many people will still accept them for the wrong reasons. These “meta-arguments” describe powerful psychological forces that affect everyone.
I would suspect that LW has a small group of people who have arrived at LW-ish beliefs through purely rational reasoning. There’s a larger group that has arrived at the same beliefs almost purely because of human biases (including the factors listed in the post). And then there’s a still larger group that has arrived at them partly through rational reasoning and partly through biases.
...rationality does have one advantage that’s been overlooked a lot lately: it results in true beliefs. Some people, myself included, value this for its own sake, and it is a real benefit.
Not really. I used to say the same thing you do: that a true belief is valuable in and of itself, even if you don’t like the consequences. But I don’t believe that anymore. As Roko once wrote, “I wish I would have never learnt about existential risks”.
I also used to feel very optimistic and excited about ‘true beliefs’, believing that having more of them would represent incredible progress, but now I only have the memory of valuing them, and I continue to pursue them a little out of discipline and habit. Scientific belief is an exception, but for anything I would call ‘philosophical’ (for lack of a better word), pursuing true belief seems empty after all.
My reason for this is that I thought ‘true belief’ (as I define it: a collection of metaphysical/philosophical ideas) would reflect some kind of reality (for example, a framework of objective value), but since such ideas aren’t entangled with reality, they don’t matter.
By the way, I consider 3^^^^3 years from now to be not entangled with reality. Having just read through your link and the helpful comments people made throughout, could you comment on which advice was most immediately helpful, or have you found any temporary or ameliorating patches since then?
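(An aside, for anyone unfamiliar with the notation: 3^^^^3 is Knuth’s up-arrow notation. Here is a minimal sketch of the standard recursive definition in Python, just to show how quickly it explodes; the function name is mine, and only the small cases are actually computable:)

```python
# Knuth's up-arrow notation, standard recursive definition (sketch).
# With one arrow, a^b is ordinary exponentiation; with n arrows,
# a (n arrows) b = a (n-1 arrows) (a (n arrows) (b-1)), and b = 0 gives 1.
def up_arrow(a: int, n: int, b: int) -> int:
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

# 3^^3 = 3^(3^3) = 3^27 = 7625597484987 is already enormous;
# 3^^^^3 (four arrows) is far beyond anything physically computable.
print(up_arrow(3, 2, 3))  # 7625597484987
```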
Having just read through your link and the helpful comments people made throughout, could you comment on which advice was most immediately helpful, or have you found any temporary or ameliorating patches since then?
I would have to read the replies again to give a definite answer, but mostly I now reason along the lines of this comment.
Maybe I should drop the stimulants! What other advice have you noticed on Less Wrong?