That’s alright! And I’m happy some of it resonated with you.
While science seems to solve a lot of problems, I think it creates new problems as it solves the old ones, and most of the problems we’re trying to solve now wouldn’t be as bad if they weren’t amplified by technology. I think science will always both solve and create problems, and that this is unavoidable because science itself is unbiased and a free-for-all. The problems will also get worse, since technology is a power-amplifier. (By the way, as technology gives us stronger tools over time, society needs more laws and restrictions to keep people from harming one another, and if you try to design a video game in which players progress like this, you’ll notice that no player is going to enjoy the end-game.)
I also don’t think we should give science all the credit. Humanity’s knowledge is mainly advanced by a few extremely intelligent people (Einstein, Newton, Tesla, Hawking, and so on). Even before the scientific method, a few highly influential people accounted for most changes in the world. Most people’s education consists of studying other people’s discoveries and theories, in the hope that they can reach a level where they provide more benefit than harm. This almost makes common people sound superficial, but that’s also because science is getting harder to use: 200 years ago you’d likely be alright if you could operate a shovel and a wheelbarrow, whereas today you’re at a disadvantage if you can’t make it through college.
It is impossible to understand the world. At best you can build a mental model that predicts it because, in some sense, your internal model is a bisimulation (I’m not sure that’s the right term, but the idea should be close). We refine theories so that they predict the world with less and less error. But the human brain already does this on its own, and intelligence isn’t actually required, nor is understanding. It’s just trial and error: you try things, keep the ones whose outcomes are good, and discard the rest. But isn’t this how Darwinism works? And how cultures and traditions originate? Neither needs to know why something works, only that it does. We don’t give tradition much credit because traditional explanations are irrational, but as far as pure results go, I think traditions do quite well. Most arguments against tradition aren’t rational but moralistic, and society doesn’t seem aware of this, even though morality and truth are in deep conflict.
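To make the trial-and-error point concrete, here is a minimal sketch (my own toy illustration, with a made-up target process and parameter names, not anything from a particular source): a blind keep-if-better loop that reduces prediction error without any understanding of the process it is modelling.

```python
import random

# Toy illustration: fit a line y = a*x + b to an unknown process purely by
# trial and error: mutate the guess, keep it if the error drops, otherwise discard it.
# The "true" process below is an assumption for the demo; the loop never inspects it,
# it only sees prediction error on sampled points.

def unknown_process(x):
    return 3.0 * x + 1.0 + random.gauss(0, 0.1)  # hidden from the "model"

def prediction_error(params, samples=200):
    a, b = params
    xs = [random.uniform(-1, 1) for _ in range(samples)]
    return sum((unknown_process(x) - (a * x + b)) ** 2 for x in xs) / samples

params = (0.0, 0.0)                  # arbitrary starting guess
error = prediction_error(params)

for _ in range(5000):
    candidate = (params[0] + random.gauss(0, 0.1),
                 params[1] + random.gauss(0, 0.1))    # blind random variation
    candidate_error = prediction_error(candidate)
    if candidate_error < error:                       # keep what works,
        params, error = candidate, candidate_error    # discard what doesn't

print(params, error)  # drifts toward (3.0, 1.0) without ever "knowing" why
```

Nothing in that loop understands why the parameters it keeps are better; selection on outcomes does all the work, which is the sense in which evolution and tradition can accumulate workable solutions without explanations.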
Besides the psychological consequences of understanding something (a kind of disillusionment), I think Moloch might be the greater danger. Moloch seems to me a result of legibility, and glorifying science makes people think that legibility (and order) are good in themselves, i.e. something where more is always better. This is not the case. I didn’t yet know this when I wrote my previous comment, but the issue is known as “high modernism”. Nassim Taleb has written about how forcing orderliness on society is dangerous, and it’s my personal belief that a lack of legibility is what holds back Moloch. The game-theoretic collapse of society is only happening now, in the modern age, because the modern age has made it possible: it has created enough order and simplicity (or the illusion of them) that enough information is available for the dilemmas to become visible. And once they’re visible, you either join them or put yourself at a disadvantage by not joining. I believe online social media is unhealthy, whereas real-life socialization is often healthy, because the latter is more chaotic (literally) and has fewer visible metrics that one could start optimizing for.
To generalize, most undesirable mechanics in life are caused by excess legibility, and by the belief that some single thing is good and ought to be optimized for. Rationality, legibility, morality, equality, happiness… Whatever single metric we choose, the outcome will be terrible. The alignment problem in AI should perhaps teach us that optimizing for anything singular is bad, i.e. that balance is key. But Taoists knew this more than 2500 years ago. I’ve often been told that religious people are just stupid and that the category of people who speak of Heaven/Nature/GNON and warn against “playing god” doesn’t include hyper-intelligent people, but from my limited experience with rationalism and science, this is wrong.
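As a toy illustration of that single-metric failure mode (a hand-rolled, Goodhart-style sketch of my own; the “engagement” and “wellbeing” quantities and their trade-off are invented for the example, not taken from anywhere):

```python
import random

# Toy Goodhart-style sketch: hill-climb on one visible metric ("engagement")
# while an unmeasured value ("wellbeing") quietly depends on the same knobs.
# All functions and numbers here are made up for illustration.

def engagement(notifications, outrage):
    return 2.0 * notifications + 3.0 * outrage           # the visible, optimized metric

def wellbeing(notifications, outrage):
    return 10.0 - 1.5 * notifications - 4.0 * outrage    # never seen by the optimizer

knobs = {"notifications": 0.0, "outrage": 0.0}
best = engagement(**knobs)

for _ in range(1000):
    trial = {k: max(0.0, v + random.gauss(0, 0.05)) for k, v in knobs.items()}
    score = engagement(**trial)
    if score > best:               # keep any change that raises the one metric
        knobs, best = trial, score

print("engagement:", round(best, 2))
print("wellbeing: ", round(wellbeing(**knobs), 2))  # keeps falling as the metric climbs
```

The optimizer isn’t malicious; it simply never sees the value it is eroding, which is roughly the shape of the problem when social life gets reduced to one legible number.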
Of course, this community will largely disagree with me, since it believes that society is clearly improving while I believe it’s clearly getting worse.