In no particular order, because interestingness is multi-dimensional and they are probably all to some degree on my personal interesting Pareto frontier:
We’re not as 3-dimensional as we think
Replacing binary questions with “under which circumstances”
Almost everything is causally linked; saying “A has no effect on B” is almost always wrong (unless you very deliberately pick an A and B that fundamentally cannot be causally linked). If you ran a study with a bazillion subjects for long enough, practically anything you could measure would reach statistical significance
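A quick simulation of the bazillion-subjects point (all numbers made up; a 0.005-standard-deviation difference between groups is practically meaningless, yet at this sample size it sails past any significance threshold):

```python
import math
import numpy as np

rng = np.random.default_rng(0)
n = 5_000_000  # a "bazillion" subjects per group

a = rng.normal(0.000, 1.0, n)
b = rng.normal(0.005, 1.0, n)  # practically meaningless true difference

# two-sided z-test for the difference in means
z = (b.mean() - a.mean()) / math.sqrt(a.var() / n + b.var() / n)
p = math.erfc(abs(z) / math.sqrt(2))
print(p)  # "statistically significant" despite the tiny effect
```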
Many disagreements are just disagreements about labels (“LLMs are not truly intelligent”, “Free will does not exist”) and can be easily resolved / worked around once you realize this (see also)
Selection biases of all kinds
Intentionality bias: it’s easy to explain human behavior with supposed intentions, but there is much more randomness and ignorance everywhere than we think
Extrapolations tend to work locally, but extrapolating further into the future very often gets things wrong. Kind of obvious, but it applies to e.g. resource shortages (“we’ll run out of X and then there won’t be any X anymore!”), to Covid (I kind of assumed cases would just climb exponentially until everything went to shit, forgetting that people would get afraid and change their behavior on a societal scale, at least somewhat, and that politicians would eventually do things, even if later than I would), and somewhat to AI (we likely won’t just “suddenly” end up with a flawless superintelligence)
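The Covid case can be caricatured in a few lines (made-up parameters, not real epidemiology): an exponential fit matches a logistic curve almost perfectly early on, then wildly overshoots once saturation and behavior change kick in:

```python
import math

# logistic "reality" vs the exponential you'd fit to its early days
K, r, x0 = 1_000_000, 0.2, 100  # carrying capacity, growth rate, initial cases

def logistic(t):
    return K / (1 + (K - x0) / x0 * math.exp(-r * t))

def exponential(t):
    return x0 * math.exp(r * t)  # the naive local extrapolation

for t in (5, 20, 60):
    print(t, round(logistic(t)), round(exponential(t)))
```

Early on (t=5) the two are within a fraction of a percent of each other; by t=60 the exponential extrapolation overshoots by more than an order of magnitude.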
“If only I had more time/money/whatever” style thinking is often misguided: the sentence usually continues with “then I could spend that time/money/whatever in other/more ways than currently”, meaning that as soon as you got more of X, you would immediately want to spend it, so you’ll never sustainably end up in a state of “more X”. Better to get used to X being limited, and to making trade-offs and decisions about how to use that limited resource, than to daydream about a hypothetical world of “more X”. (This doesn’t mean you shouldn’t think about ways to increase X, but you should probably distance yourself from imagining a world in which X is not limited.)
Taleb’s Extremistan vs Mediocristan model
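A rough illustration of the distinction (the Pareto exponent is an arbitrary assumption, not real wealth data): in Mediocristan no single sample moves the total; in Extremistan one sample can be a visible chunk of it:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# Mediocristan: human heights (cm), roughly normal
heights = rng.normal(175, 7, n)
# Extremistan: wealth, heavy-tailed (Pareto with made-up exponent 1.1)
wealth = (rng.pareto(1.1, n) + 1) * 1e4

print(heights.max() / heights.sum())  # the tallest person is a rounding error
print(wealth.max() / wealth.sum())    # the richest person is a big slice
```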
+1 to Minimalism that lsusr already mentioned
The mindblowing weirdness of very high-dimensional spaces
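One way to feel this weirdness: in high dimensions, two random directions are almost always nearly orthogonal. A quick sketch (assuming numpy):

```python
import numpy as np

rng = np.random.default_rng(1)
mean_abs_cos = {}
for d in (2, 1000):
    x = rng.normal(size=(500, d))
    x /= np.linalg.norm(x, axis=1, keepdims=True)  # 500 random unit vectors
    cos = x[:250] @ x[250:].T  # cosines between all cross pairs
    mean_abs_cos[d] = np.abs(cos).mean()
    print(d, mean_abs_cos[d])
```

In 2 dimensions the average |cosine| between random directions is around 0.64; in 1000 dimensions it collapses toward zero, i.e. essentially every pair of random vectors is close to perpendicular.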
Life is basically an ongoing coordination problem between your past/present/future selves
The realization that we’re not smart enough to be true consequentialists, i.e. consequentialism is somewhat self-defeating
The teleportation paradox, and thinking about a future world in which a) teleportation is just a necessity to be successful in society (and/or there is just social pressure, e.g. all your friends do it and you get excluded from doing cool things if you don’t join in) and b) anyone who has teleported before has convincing memories of having gone through teleportation and coming out on the other side. In such a world, anyone with worries about teleportation would basically be screwed. Not sure if I should believe in any kind of continuity of consciousness, but it certainly feels like a thing. So I’d probably prefer not to be forced to give that up just because the societal trajectory happens to lead through ubiquitous teleportation.