You have a point, even if it is expressed in a hostile manner.
However, from the outside it is often hard to tell whether something is a goal or a side effect. It certainly looks like whatever afflicted this planet intends to produce as much CO2 as possible, even though fossil fuel burning sometimes has to be delayed to produce enough tech to get access to more fossil fuels. Or you can pick some other artifact of human development, and in the right light it would look like maximizing it is a terminal goal. Like the number of heavy objects in the air at any given time.
I’m sorry, I’m not trying to be hostile. But words have meanings. If you could equate “Y has been observed to produce some X among a myriad of other effects” with “Y is an X maximiser”, what’s the point in having a word like “maximiser”? Hell, even remove the “myriad of other effects”—a paperclip-making machine or paperclip factory isn’t a paperclip maximiser either.
“It certainly looks like whatever afflicted this planet intends to produce as much CO2 as possible”
It certainly does not.
Well, maybe to a very contrived observer: you’d have to have all the knowledge about our planet necessary to realise that a CO2 increase is happening (not trivial) and that it’s not a natural effect of whatever changes the planet naturally undergoes (even less trivial), and somehow magically be ignorant of any other details, to even entertain such a notion. Any more knowledge and you’d immediately begin noticing that our civilisation produces a myriad of effects that it would not bother producing if it were a CO2 maximiser, and that for all the effort and ingenuity it puts into its works, the effects in terms of CO2 increase are actually rather pathetic.
You’re closer to a reasonable use of the term by calling the paperclip-producing advanced civilisation an incidental paperclip maximiser, because the end result will be the same: all matter eventually converted into paperclips. It’s still a stretch, though, because a maximiser would take the shortest route towards tiling the universe with paperclips, while the advanced civilisation will be actively trying to minimise paperclipping relative to its actual goals; it will try to extract as much usefulness as it can out of every bit of matter converted. So it’s still an incidental producer, not a maximiser. Would an outside observer be able to tell the difference? I don’t know, but I suspect that the way this civilisation would be doing a myriad of interesting things, instead of focusing solely on the most efficient way to produce paperclips, would be an easy giveaway.
Of course if we only look at end results to decide if we call something a “maximiser”, then any agent actively pursuing any goals is an “entropy maximiser”. At this point I stop feeling like language conveys useful meaning.
“if we meet what looks like an alien Clippy, we should not jump to conclusions that paperclip maximizing is its terminal value.”
Yes, certainly. It seems to me that the thought you were expressing is actually the opposite of the words I’ve been objecting to: if something looks like a maximiser, it’s possible it isn’t one.
“humans are accidental CO2 maximizers”
You’re abusing words. There’s a big difference between a producer of X and a maximiser of X.