We don’t observe “most things” in the first place. We see a minuscule subset of all things—a subset which is either the target or the result of a maximisation process.
IMO, most things that look as though they are being maximised are, in fact, the products of instrumental maximisation—and so are not something that we need to include in the preferences of a machine intelligence, because we don’t really care about them in the first place. The products of instrumental maximisation often don’t look much like “by-products”. They often look more like intrinsic preferences. However, they are not.