I like the concept, but I don’t want to make “information dark matter” any more accessible, and I think it’s a bad idea to think about how it might be done. It seems like the kind of optimization which increases the legibility of society, makes it easier to automate and exploit things, and, perhaps most importantly, allows an authority which doesn’t want these things to exist to find them and remove them effectively.
I will not be able to defend this point very well, as I don’t think words exist for most of the ideas I have in mind. I can only vaguely point at them, with patterns which are possibly so abstract that they won’t make sense to most people:
What’s optimal is context-dependent, and thus local. If you read 100 books and write your own book, the resulting book will be smaller (and not only because redundancies have been removed, or because you’ve thrown away objectively wrong information). Because of a fractal-like structure, mutually exclusive information can coexist as long as it’s isolated. When you force interaction between different things, you basically destroy their differences, arriving at a more “general” result which is their average or their intersection. Enforcing any kind of consensus is therefore destructive.
I don’t believe that sharing positive things really increases them. If anything, it probably makes for a more even distribution, or a faster exhaustion of what’s good (everything good is a resource, and every resource is limited). In fact, the “goodness” of a resource is probably a byproduct of burning that resource, so that what we consider good is actually the destruction of good. If you watch a good movie, you’re having fun, but if you watch it again you will find that it’s less fun than the first time. I think your concept of “value” might have similar issues, and that “efficiency” might just mean “exploitability” or “step size taken towards equilibrium”. Nothing is entirely good or entirely bad, and I’d like to warn against optimizing towards any metric whatsoever. I think the alignment problem shows that we fundamentally misunderstand what optimization is. We cannot come up with a single metric to optimize for which doesn’t result in the destruction of society, and I doubt this is merely because we’re having trouble defining “ethical”, “good”, “moral”, etc.
The idea that mixing things is good is anti-biological, and people who come to such conclusions tend to be, from my own observation, transhumanist in some sense (weaker sense of shame, less need for personal space, dislike of discrimination, dislike of borders, little need for privacy, smaller amygdalas, etc.).
We treat family differently than strangers, and we tell our friends information which we wouldn’t tell others. All of this creates a kind of “closedness”, and I very much doubt that we evolved this tendency just because we’re irrational beings who don’t want a prosperous society. Even this community is a bit of a walled garden, and that’s precisely because walls, or at least discriminatory access, are essential for the survivability of a system (and I do see the humor in the possibility of getting banned or rate-limited for this comment).
It’s often more frustrating to interact with bigger companies, since you’ll encounter bureaucracy and automated systems which never have the answer you’re looking for. It’s only when you can get in contact with another human being that the process tends to be pleasant. But mindsets in which optimization, globalization, automation and other such things are regarded as positive cause the world to tend towards a less biological/human state. Even though society is getting more connected, loneliness is on the rise, and it’s because the efficiency of the system is optimizing away human interaction and freedom. This is another reason I’d like to reduce the systemization of society.